Tibbo Technology Announces New TIDE Release 5.03.03 Featuring the TiOS Simulator

New Tibbo IDE (TIDE) release 5.03.03 includes a Tibbo OS (TiOS) Simulator. The Simulator implements a virtual TiOS device incorporating a virtual Ethernet interface, EEPROM, flash memory, MD button, buzzer, and status LEDs, as well as a virtual LCD and keypad.

The Simulator makes it possible to test-drive TIDE and TiOS, as well as run and debug Tibbo BASIC and C applications, without having to purchase a physical Tibbo device.

The Simulator can be found here: WINDOWS START > Tibbo > Tibbo IDE > TiOS Simulator. You can also start TiOS Simulator from within TIDE: Debug > Start TiOS Simulator.

Once the Simulator is running, it appears in the Device Explorer as any other TiOS device would. To write an app for the Simulator, select the SIMULATOR platform and set the Simulator as the debug target.

TIDE 5.03.03 is distributed with a number of test projects written specifically for TiOS Simulator. You can find them here: (My) Documents\TIDE\Samples.

www.blog.tibbo.com

The real cost of robotics

Courtesy of techcrunch.com

By Dmitry Slepov – CRUNCH NETWORK CONTRIBUTOR

Dmitry Slepov is the managing director and a co-founder of Tibbo Technology.

Before we begin, I feel compelled to make an important disclosure: I love robots! Robots are cool. To me, robots are cooler than people, who are only cool occasionally. I especially love industrial robots: They help us “make” things.

Now please allow me to make an obvious observation: Like me, many folks think robots in general, and industrial robots in particular, are very cool. Some of these folks hold high offices in various business enterprises, where they broadcast their love for robots into their working environments.

The media’s current infatuation with industrial robots and automated manufacturing has these guys whipped into a frenzy. I know how they feel, because I feel the same way. How else can you feel after watching a video showing the production of motors for Dyson vacuums? The video is sexy as hell — shiny machines executing a complex dance in perfect unison. It’s a city full of wonders, completely devoid of boring humans.

The conclusion is inescapable: Sir James Dyson must be the overlord of robots. That is, until you see the video on how the Tesla Model S is made. Another professionally put-together report from the land of (almost) no workers.

There are many videos like that, and the media is rebroadcasting them with enthusiasm. This leads me to this question: Why the renewed interest? Robots were perfectly capable of automatically manufacturing complex things decades ago. Just look at industries with products that can’t be directly handled by humans: the semiconductor industry, where the product is too small for human hands, has been building vast automated facilities since the IC revolution began, and the American auto industry started filling its factories with robots back in the 1970s.

Why is the media taking such an interest now? Because today, the robotics industry has a set of fresh economic and political messages:

  • Robots are becoming affordable.
  • Anyone can benefit from purchasing a robot.
  • Robots will increase our production efficiency.
  • Robots will allow us to “reshore” (run away from China).
  • We will be able to make things in our country again.
  • We will get rid of workers — they are just too expensive and too lazy and kids these days don’t want factory jobs anyway.

Although all of the above are true to some degree, the simplicity of media coverage distorts the real situation. After watching numerous videos showing cool automation in action, it would be easy for you to get the wrong idea about how much effort it takes to automate anything.

I will argue, based on my own and my peers’ experience, that a lot of folks imagine the process of bringing a robot onto their production floor as, literally, bringing a robot in. You buy a robotic arm, you install that robotic arm, you’re done.

It’s hard to blame them. Robotic manipulators are what they see in videos. If you hear the words “industrial robot,” what pops up in your mind? The arm! Get one or a couple of these and you are on the way to your company’s automated future. If only it were that simple!

Let’s look at what it takes to create a typical manufacturing cell that assembles something. We start at the moment when you decide to acquire a robot…

Because you know that you definitely need a robotic manipulator, you start your purchasing or your mental journey from getting that arm. So you buy an ABB, KUKA, Toshiba, EPSON or some other brand you saw at the robotics trade show you visited recently. Depending on the brand, your outlay for the arm is perhaps $30,000-60,000. Despite the high cost, that arm is literally… an arm. No torso. No wrist. No fingers. No eyes. And no brains (I’ll get to this later).

Next, you find out that you can’t simply install your robot on any desk. No. It must sit on a heavy-duty, purpose-built pedestal. These pedestals are enormously heavy, and they are expensive, too — expect to spend several thousand dollars.

But wait, there’s more. Your robot needs a cage… unless it’s one of those new collaborative robots like the UR-10 from Universal Robots. Because they are allowed near people, they move like yoga instructors, putting you to sleep in the process. If you care about doing things fast, you’ll buy a speedy robot — and it will have to be caged. The cage will need to come with some safety equipment, like an emergency stop button, safety sensors and so on. Chalk up several more thousand dollars for the cage and all that safety stuff.

Next, you’ll need to take care of something called an “end effector.” That is the part that attaches to the business end of your robotic arm and allows it to do useful things. End effectors vary, from grippers with fingers for holding things to vacuum heads to electric screwdrivers to an endless array of specialized contraptions. Chances are you won’t find any suitable end effector for your application, so someone will have to build you one. Budget a lot of money for this part of your project.

Human hands are extremely versatile and can do thousands of different jobs. Not end effectors. Your robot will probably have to be equipped with several end effectors for handling different production steps. This will involve the use of a so-called tool changer. It’s just like in Japanese cartoons. One moment that giant robot holds a bazooka, the next it’s a ray gun.

With a tool-changing system, like the one made by ATI-IA, your robot will be able to quickly change between, say, an electric screwdriver and a suction gripper. The bad news is that tool changers are so expensive that adding such a system will easily cost you around 30 percent of what you paid for your robotic arm.

Next, you’ll need to think about giving that arm of yours some ability to sense. Most robots don’t come with “force feedback.” They boldly go where you tell them to go, no matter how many things get smashed along the way. A typical robotic arm with a gripper is about as sensitive as a crab claw (no offense to crabs). A force-sensing accessory will solve that, to a degree, but it will also set you back several more thousand dollars.

Wait, you aren’t there yet. Now you need to think of a way to hold your “parts in production,” i.e. parts that your robot will be working on. Humans come with two hands. We can hold a screwdriver in one hand and secure the part on which we are working in the other. Try doing any kind of assembly using just one hand. You won’t get very far. Well, that’s the situation your robot will be in, constantly, unless it’s one of those cute two-armed ABB Yumi robots (there is nothing cute about their price; you can buy two or three one-armed servants for the price of one Yumi).

So, in order to hold your “parts in production” in place, you’ll need to come up with fixtures and contraptions that are unique to whatever it is you are manufacturing. There are many ways to do this stuff. For example, my company supplies a construction system called UniQb. You can quickly build one-off fixtures and rigs using its “beams.” This part of your project may not be very expensive (in comparison to everything else), but it will consume quite a bit of time.

This step handled, you will need to think of how your robot will get the parts to work on and output the fruits of its labor. Robots can’t (yet) run to the warehouse and cart back a bunch of parts. Your robot is like a master craftsman sitting in the middle of a studio. Everything must be brought to it. For small parts, such as screws, you will need to install “screw presenters” — machines that “offer” screws in the right orientation. Larger parts will need to come on conveyor belts or some other means of transportation. Alternatively, you can assign a worker who will service the robot while contemplating a philosophical question: “Who works for whom? Does the robot work for me or do I work for the robot?”

The next step is to equip your mechanical monster with eyes. With the exception of the aforementioned Yumi, which, in appropriately mutant fashion, features eyes on its hands, most robots arrive at your doorstep completely blind.

You will need to install a vision system consisting of one or more cameras and a processing unit. You will also need to arrange ideal lighting conditions: Cameras are not like human eyes. Too bright or too dark, and the system won’t work. Also, robots mostly see in 2D. There are some new 3D vision systems on the market, but these are still prohibitively expensive. A good vision system will cost you several thousand dollars and a lot of trial and error until you get it to work right.

Also, don’t forget about electric power and air supply. Many robots will require “industrial” power (not the power available “on tap” from your wall outlet). Your system will almost certainly use vacuum grippers or something else that requires “air.” Robots don’t come with compressors. You will need to buy and install one. More $$$ spent.

Are we there yet? Nope! All this extra stuff you now have around your robot will need to be hooked to a single control system that opens and closes valves, activates servos, senses the position of things and so on. Such jobs are typically accomplished with programmable logic controllers (PLCs) or embedded computers.

Last but not least: programming. This part is particularly fun. You will need to teach your robot how to do anything useful. Hello, disappointment. We all grew up watching Star Wars, so we automatically attribute some intelligence and magical powers to our mechanical helpers. Forget it. Robots are not smart. In fact, they are plain dumb. You will need to teach your robot literally every tiny little move. There is virtually no self-learning. Expect a lot of labor. You will be trying, adjusting and, when you thought you were done, you’ll find yourself coming back to adjust some more.

All these steps I’ve outlined require you to be a very skilled professional in a multitude of disciplines. Chances are that you aren’t — and even if you are, it’s unlikely you have time to deal with all this complexity. This is why you will probably hire an integration company to put the system together for you… for a price tag roughly twice the sum total of all the parts involved.

In the end, you will look at that robotic arm you started your journey with and realize that the arm is but a tiny part in the long list of equipment that had to be provisioned, installed and configured in the name of your automation project. You see now what it took Sir Dyson and Mr. Musk to fill their factories with hundreds of robots? They approved oceans of work, hundreds of thousands of hours of human planning and design and tens of millions of dollars in equipment costs.

And now for the worst part… Here it comes. Da-dah! These futuristic production lines you see on TV and YouTube are mostly built to handle just one product. Change the product, and you need to redesign your production line. You don’t just tell your robot to “stop doing this thing and start doing that thing right from tomorrow morning.” You start “retooling” — and retooling is expensive and time-consuming.

The U.S. auto industry with its futuristic robots learned this the hard way, while the Japanese (whom we firmly associate with robots) did not go overboard and simply stayed with lean production teams of human workers. Take heed! Before embarking on your automation journey, count how many years of human salaries you will be able to pay by NOT investing in your smart robotic manufacturing cell.
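As a back-of-the-envelope check, here is that comparison in a few lines of Python. Every figure is hypothetical; the only assumption borrowed from this article is an integrator fee of roughly twice the parts total.

```python
# All figures are hypothetical, for illustration only.
arm = 45_000             # mid-range robotic arm
extras = 90_000          # pedestal, cage, end effectors, tool changer, vision, compressor...
integration = 2 * (arm + extras)    # integrator fee: roughly twice the parts total
cell_cost = arm + extras + integration

worker_per_year = 40_000            # fully loaded salary, also hypothetical
payback_years = cell_cost / worker_per_year
print(f"Cell cost: ${cell_cost:,}, about {payback_years:.1f} worker-years")
```

With these made-up numbers, the cell costs as much as a decade of one worker's salary, and that is before any retooling.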

Have I just put you off robotics? I hope not! Like I said at the start of this conversation, I love robots! Robots are cool. There are many excellent reasons to use them. We humans are unpredictable and difficult, and as time goes by we become less and less inclined to take on factory jobs.

Automation is coming, and robots will eventually take over our production lines. I just want you to know that today’s real-life robots are nothing like what the media makes them out to be. Proceed with caution (and deep pockets).

FEATURED IMAGE: HIEROGRAPHIC/SHUTTERSTOCK

Unusual features you would not expect from a terminal emulator

I don’t know about you, but I love Lego, I really do. Back when I was a kid I actually didn’t have the original Lego set – my father was a fan of more advanced construction toys like Meccano and Erector, where you had to screw parts together with nuts and bolts. However, being a programmer – and I am one – you get to appreciate the beauty of the Lego concept.

Building complex devices or systems by connecting simple blocks together is the essence of engineering, and you will be hard-pressed to find a better example of “complex from simple” than that offered by Lego. All the blocks are compatible, all the blocks connect to other blocks easily, you don’t need any nuts, bolts or tools, and only your imagination is the limit for what you can build. And boy, have you ever tried to *break* a Lego brick? I dare you to accomplish this!

When we design software, this has got to be our ultimate goal. We want it to be like Lego. We want it to be built with small building blocks, each one as simple and robust as a Lego brick, each one interchangeable and easily interconnectable with other blocks.

Unfortunately, it is not always possible to achieve this goal, and even when it is, we often find excuses for not going all the way in Lego-ising our projects – after all, our customers (and most employers) usually only care about the end product, not the bricks it is built of.

I will argue, however, that investing time, money, and creative thinking into Lego-ising your software product may reward you with unanticipated product strengths and features. In support of this statement, allow me to present Exhibit A.

Exhibit A

The case in point is our software called IO Ninja – here is a link to my introductory article covering the origins of Ninja. To make a long story short, once upon a time we were looking for binary-enabled IO terminals and sniffers for a specific set of transports and protocols. After not being satisfied with what we found (check the link above to find out why), we ended up creating our own all-in-one low-level IO debugger. Somewhere along the way we got a crazy idea of making it programmable, so our users could write their own protocol analyzers and testing scripts. This is the preamble to what I want to tell you.

These two design goals – to build an all-in-one product and to make it programmable – have essentially forced a Lego-like architecture upon us. So, this wasn’t even a choice; we couldn’t get away with the usual “the end product is what matters” mantra. In our case, the end product had to look a lot like Lego!

Here is what we ended up with: IO Ninja software consists of two large parts – the binary part and the script part. The binary part provides a framework of Lego bricks: sockets, serial ports, sniffers, buttons, combo boxes and so on. Scripts glue our Lego bricks into specialized IO sessions.

Now comes the interesting part: After creating a Lego-like design, we discovered that IO Ninja naturally has a plethora of interesting, useful, and unexpected features that were not anticipated even by us – the builders of this product!

Ability to redirect anything-to-anything

Users of Unix-like systems are well aware of the possibility of redirecting the output of one program to the input of another. Session linking in IO Ninja provides similar functionality. The difference is this: instead of the unidirectional data cascade of Unix pipes, session linking shorts the TX stream of one session onto the RX stream of another – and vice versa. After this shorting, IO Ninja keeps passing the data back and forth between the two sessions. That turns IO Ninja into a universal redirector.
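Conceptually, the linking works like this. Below is a minimal Python sketch of the idea, not IO Ninja's actual implementation; the `link` helper is hypothetical:

```python
import socket
import threading

def link(a, b):
    """Short the TX stream of each endpoint onto the RX stream of the other,
    passing data back and forth until one side closes (a rough analogue of
    IO Ninja's session linking)."""
    def pump(src, dst):
        while True:
            data = src.recv(4096)
            if not data:       # peer closed; stop forwarding this direction
                break
            dst.sendall(data)  # in IO Ninja, the data would also be logged here
    for src, dst in ((a, b), (b, a)):
        threading.Thread(target=pump, args=(src, dst), daemon=True).start()
```

Here `a` and `b` happen to be sockets, but the same shorting idea applies to any pair of byte-stream endpoints: a serial port, a named pipe, a TCP connection.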

Let’s say you have a device attached to your PC via a serial port, and you want to be able to connect to this device over TCP. That’s right. Connect to a serial device over TCP.

Launch IO Ninja and start a new serial session. Open and configure your serial port. Next, open a TCP Listener session, choose a TCP port to listen on, and start listening.

Now all is ready for you to reap the fruits of IO Ninja’s Lego-like architecture! Go Menu->Session->Link sessions. Voila! You can now connect to your serial device via TCP (notice the chain links on session tabs):

You can use the same approach for redirecting TCP-to-UDP, UDP-to-SSH, Named Pipe-to-TCP, and so on. As an extra bonus, all passing data will also be logged in the process, turning your setup into a universal man-in-the-middle sniffer.

Searching for devices using UDP broadcasts

A common way of auto-discovering devices on a local network segment is by broadcasting a UDP request packet and collecting replies from participating devices. This is how Tibbo’s software discovers our devices.

With IO Ninja’s modular architecture, we were able to take the UDP socket Lego brick and plug it into a dedicated session specifically designed to handle one-to-many communications.

To try it, open a UDP Socket session, set the remote address to 255.255.255.255 (or to a subnet broadcast address like 192.168.1.255), then send an echo broadcast packet. As a result, you will see the list of devices connected to your local network segment. Now choose one of these devices, copy-paste its IP address and start communicating with this particular device!
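The underlying broadcast-discovery pattern is easy to sketch in Python. The probe payload and default port below are placeholders, not Tibbo's actual discovery protocol:

```python
import socket

PROBE = b"who-is-there?"  # placeholder payload; real devices define their own

def discover(target="255.255.255.255", port=65535, timeout=1.0):
    """Send one broadcast probe, then collect (address, reply) pairs from
    every device that answers before the timeout expires."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow broadcast sends
    s.settimeout(timeout)
    s.sendto(PROBE, (target, port))
    found = []
    try:
        while True:
            reply, addr = s.recvfrom(1500)
            found.append((addr, reply))
    except socket.timeout:
        pass  # collection window is over
    return found
```

One request goes out; the list that comes back is exactly the "list of devices on your local network segment" the session shows you.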

Check out the compass button. When working with the UDP protocol, you often need to reuse the source address from which the most recent packet has arrived (after all, UDP is a connectionless protocol). Of course, nothing prevents you from copying an address from the log and pasting it into the remote address combo box, yet why do something manually when you can automate the task? Press the compass button, and the UDP Socket session will automatically readjust the remote address to the address from which the most recent packet was received.

Sniffing TCP data using a TCP proxy

IO Ninja includes a Network Sniffer session. This is a pcap-based plugin for monitoring network protocols. When running it, IO Ninja behaves much like the Wireshark software. Despite the prevalence of pcap-based tools, the pcap-based approach is not the only way to eavesdrop on network communications. In some situations, using a pcap-based sniffer is simply not possible; in others, there are much better ways to tap into the data stream.

In addition to the traditional pcap-based sniffer, IO Ninja provides a second kind of TCP sniffer: it is called a TCP Proxy, and it is a combination of two TCP sockets – a client one and a server one – passing data between each other.

Here is the benefit of this approach: instead of a packet-based log (with disjoint chunks of data you must mentally piece together), your TCP proxy session provides a clean log of data streams – just like you would see in a TCP connection session. Also, unlike with the pcap-based sniffing, which is only applicable to local networks, TCP proxy sessions allow you to monitor TCP links in all situations, as long as you can position yourself to be the man-in-the-middle.
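The proxy idea itself fits in a few lines of Python. This is a sketch under the assumption of a single client; the `start_tcp_proxy` helper is hypothetical and IO Ninja's plugin is of course more elaborate:

```python
import socket
import threading

def start_tcp_proxy(upstream_addr, transcript):
    """Listen on an ephemeral local port; relay the first client's connection
    to upstream_addr in both directions, appending every chunk to `transcript`.
    Returns the proxy's (host, port) for the client to connect to."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)

    def serve():
        client, _ = srv.accept()
        upstream = socket.create_connection(upstream_addr)
        def pump(src, dst, direction):
            while True:
                data = src.recv(4096)
                if not data:
                    break
                transcript.append((direction, data))  # the "sniffed" stream
                dst.sendall(data)
        for args in ((client, upstream, "TX"), (upstream, client, "RX")):
            threading.Thread(target=pump, args=args, daemon=True).start()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()
```

Point the client at the returned address instead of the real server, and `transcript` accumulates both directions of the stream already reassembled, with no packets to piece together mentally.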

Here is a screenshot of an FTP session being monitored with the TCP Proxy plugin:

Debugging named pipes in Windows

IO Ninja can be used to debug a very common IPC (inter-process communication) method in Windows called named pipes. For example, all Tibbo kernel-mode drivers use this method to communicate with user-mode services and configuration utilities.

Unfortunately, debugging named pipes can be pretty tough due to the lack of named pipe terminals and monitors. Before IO Ninja, you could not even send a packet over a named pipe without having to write a test application! Naturally, we were determined to fix that – IO Ninja can work as both the client- and the server-side terminal for Windows named pipe communications.

To use IO Ninja on the server side (the one that calls the ConnectNamedPipe function), launch the Named Pipe Listener session. This plugin allows you to accept named pipe connections originating from your application, driver, or service, much like the TCP listener plugin accepts incoming TCP connections. After accepting a connection, you can communicate with a client and analyze the log of received commands.

For emulating the other side of named pipe communications, use the Generic File session.

For monitoring named pipe communications you can employ the man-in-the-middle approach described earlier. To do so, start a Named Pipe Listener session, link it to a Generic File session and redirect your named pipe client to IO Ninja. In the future, we are planning to provide a dedicated standalone filesystem filter-based plugin for sniffing Windows named pipes. For the time being, the man-in-the-middle approach will suffice for most debugging scenarios.

Conclusion

Described above are just some of the many cool features stemming from the Lego-like architecture of IO Ninja. Allow me to repeat this: we hadn’t even planned to implement some of those features – they simply appeared as a result of having a well-planned modular architecture. Make no mistake – applying this Lego principle to your project is a hard job. It requires a massive investment of time and effort, take my word for it. Still, going down this path is well worth considering. Who knows, maybe you will also discover a few unexpected and useful features magically appearing as a result of your Lego-like approach.

So, let’s Lego-ise!

http://tibbo.tumblr.com/
