By: Rogério Loureiro, Manager – Digital

The term “IoT” is all the rage in the world of IT.

I’ve seen many posts and articles in various media covering the fact that, like many other trends, the IoT is becoming more and more common in our daily lives.

The first use of the term “IoT” has been attributed to MIT researcher Kevin Ashton who, in 1999, conceived a system wherein all physical “things” would be interconnected via the Internet, through electronic sensors scattered everywhere, and that they could interact among themselves and with people.

The IoT is not a technology, nor a set of technologies. It’s also not restricted to the use and application of RFID and/or electronic sensors. It is, in a broader sense, a concept. And, contrary to what many people believe – or post – it’s not a totally new concept. It’s an evolution of concepts that have existed for a while in industrial production.

In the late 70s and early 80s, manufacturers of integrated circuits, led by giants such as Intel, Motorola, and Texas Instruments, followed by smaller companies such as Zilog, invested heavily in the large-scale manufacture of microprocessors, which brought personal computers to life in an age that had been dominated for more than 20 years by mainframes. These were complex integrated circuits, with a high density of components and functions, able to pack the processing capacity of large computers onto printed circuit boards occupying only a few square centimeters. Among the most widely used microprocessors of the day were Intel’s 8080 family, Motorola’s 6800 family, Zilog’s Z80, and MOS Technology’s 6502. Companies such as IBM, Apple, Commodore, and Tandy Corporation built their personal computers around these processors.

Computers became more affordable, which led industry to adopt them to automate manufacturing processes, starting in engineering offices.

With the increased demand for microcomputers, manufacturers (always with their eye on the movement of the market) created new components, dubbed microcontrollers. These were processors whose design included built-in general-purpose peripherals, such as digital and analog inputs and outputs and serial interfaces.

Then, PLCs (Programmable Logic Controllers) arose; conceptually, these were microcomputers designed for industrial automation.

These came to replace the old control panels used in plants, which were governed by complex circuits of interconnected relays, buttons, and switches. Those panels enabled the “programming” of repetitive mechanical tasks that had previously been done manually, such as turning a conveyor on and off, or activating a stamping press based on a given process status.
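The kind of interlock logic those relay panels hard-wired, and that a PLC later ran as a program, can be sketched as a simple scan loop: read the inputs, evaluate the logic, write the outputs. This is only an illustration in Python, not a real PLC language; the sensor and actuator names are hypothetical.

```python
# Hypothetical PLC-style scan cycle for a conveyor and a stamping press.
# A real PLC expresses the same logic in ladder diagrams or IEC 61131-3
# languages; all signal names here are illustrative.

def scan_cycle(inputs: dict) -> dict:
    """One scan: evaluate the interlock logic and return output states."""
    outputs = {"conveyor": False, "press": False}

    # Run the conveyor while the start button is latched and no fault is active.
    outputs["conveyor"] = inputs["start_latched"] and not inputs["fault"]

    # Fire the press only when a part is in position and the conveyor is stopped.
    outputs["press"] = inputs["part_in_position"] and not outputs["conveyor"]

    return outputs

state = {"start_latched": True, "fault": False, "part_in_position": False}
print(scan_cycle(state))  # conveyor on, press off
```

A relay panel encoded exactly this kind of boolean logic in physical wiring; the PLC's advance was making it reprogrammable.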

The dissemination of microcontrollers enabled the creation of embedded systems, giving information processing capabilities to machines that had once only performed mechanical tasks. This technology gave rise to the so-called M2M (machine-to-machine) integration, wherein different pieces of equipment, with different functions, on a production line or in an industrial process, communicated among themselves to execute processes in an orderly and automated fashion. Microcontrollers have gone through several generations, and today they are broadly used in industrial automation.
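The essence of M2M is one machine emitting an event and another acting on it without human mediation. A minimal sketch, using a Python queue to stand in for the fieldbus or serial link between two hypothetical stations on a bottling line (the machine names and message format are invented for illustration):

```python
import queue

# Illustrative M2M handshake between two machines on a production line.
# line_bus stands in for a fieldbus or serial link; the message schema
# is hypothetical.
line_bus = queue.Queue()

def filler(bottle_id: int) -> None:
    """Filling station announces a finished bottle to the line."""
    line_bus.put({"from": "filler", "event": "bottle_ready", "id": bottle_id})

def capper() -> list:
    """Capping station consumes ready-bottle messages and acts on each."""
    capped = []
    while not line_bus.empty():
        msg = line_bus.get()
        if msg["event"] == "bottle_ready":
            capped.append(msg["id"])  # cap the bottle named in the message
    return capped

for i in range(3):
    filler(i)
print(capper())  # [0, 1, 2]
```

The point is the pattern, not the transport: industrial M2M runs the same publish-and-react flow over protocols such as Modbus or OPC UA instead of an in-process queue.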

With the emergence and spread of the Internet, the industrial concept of M2M began to appear on the horizon of people’s daily lives, leading us to what Kevin Ashton called IoT – the Internet of Things. As can be seen, the IoT is a concept remodeled for today’s technological and economic reality: common machines, such as home appliances, smartphones, TVs, cars, industrial equipment, electronic equipment, and other “things,” will be interconnected via the Internet and will have the autonomy to “make decisions” based on data taken from the physical environment through electronic sensors, and to execute automated functions via actuators such as micromotors, electronic switches, or even robots.
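That sense-decide-act cycle can be sketched in a few lines: a sensor reading drives an autonomous decision that an actuator then carries out. The thresholds and function names below are illustrative assumptions, not any product’s API – think of it as the skeleton of a connected thermostat.

```python
# Hypothetical IoT decision loop: sensor data in, actuator command out.
# Setpoint and dead band are made-up values for illustration.

def decide(temperature_c: float, setpoint_c: float = 22.0) -> str:
    """Return an actuator command from a sensed temperature."""
    if temperature_c > setpoint_c + 1.0:
        return "cool"  # e.g. close a fan relay
    if temperature_c < setpoint_c - 1.0:
        return "heat"  # e.g. close a heater relay
    return "idle"      # within the dead band: do nothing

readings = [19.5, 22.3, 24.8]
print([decide(t) for t in readings])  # ['heat', 'idle', 'cool']
```

In a real deployment the readings would arrive over the network and the commands would drive physical actuators, but the decision logic that gives the “thing” its autonomy is exactly this small.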

In combination with the technologies we have at our disposal today, like nanotechnology (sensors and actuators), big data (data), and cloud computing (processing), the concept of IoT allows for countless applications across all areas of human knowledge. And with these applications will come opportunities to overcome the challenges that their implementation will bring.

Are you ready for this new world?