
IT trends change rapidly, which often confuses companies, and the buzzword bingo that comes with them could fill a new dictionary of hot topics every year. Here are a few of the trends we have seen over the past 15 years: portal solutions, ERP systems, big data, in-memory technologies, the Internet of Things (IoT), digitalisation, artificial intelligence (AI), virtual reality (VR) and augmented reality (AR). Using the connection between IoT, AI and VR/AR, I would like to show you why trends from years past are still relevant, and why they even need the newer ones in order to evolve.

What these three trends have in common is not only that they rose to prominence in the last four years, but also that they have their origins in the last century.

Our trends over the last hundred years

The oldest of the three technologies is virtual/augmented reality. Its origins date back to 1932, when Edwin Land, co-founder of the Polaroid Company, developed polarised glasses. The polarising filters created at that time are still used to make 3D films today: two images, taken from two slightly different points and projected on top of each other, are fed separately to the right and left eyes. It wasn’t until 30 years later that Morton Heilig built the first passive VR machine, the ‘Sensorama’. The term ‘virtual reality’ as it is used today probably first appeared in a novel in 1982, but there was no uniform understanding of it until it was added to the Oxford English Dictionary in 1987. The development of VR progressed steadily after that, which also brought augmented reality, virtually unknown until then, into focus. The ISMAR (International Symposium on Mixed and Augmented Reality) was launched in 2002 and is still held today. Both technologies reached a peak in 2017, when they became available to everyone.

Artificial intelligence has had a similarly long journey to become what we know today. In 1936, the mathematician Alan Turing proved that a computing machine is capable of executing cognitive processes as long as those processes can be broken down into individual steps and represented as an algorithm. The term ‘artificial intelligence’ wasn’t coined until 20 years later, by the computer scientist John McCarthy at a conference at Dartmouth College, where the world’s first AI program was also written. In 1966, the development of artificial intelligence picked up speed with the creation of the first chatbot, ‘ELIZA’, by Joseph Weizenbaum. By 2018, AI could already hold its own against two debate champions or arrange hairdresser appointments in a conversational tone, without anyone noticing that a machine was talking.

Of these technologies, the Internet of Things is the most recent. The idea was only able to take hold thanks to Tim Berners-Lee’s development of the World Wide Web in 1989. The term itself was coined by the British technology pioneer Kevin Ashton in 1999. Since then, the capabilities of IoT have developed steadily, from cloud technologies to the vision of a ubiquitous IoT landscape (2016).

The current relevance of IoT

While IoT was the trend of 2017 and 2018, AI and VR are now experiencing their boom in the business world. This does not mean that IoT has been pushed aside, and it certainly cannot be dismissed as yesterday’s news. Studies show, however, that IoT is only slowly gaining relevance in companies: in the ‘Internet of Things’ study conducted by TÜV Süd in 2019, 56 per cent of the companies surveyed considered IoT to be particularly relevant, up from 51 per cent the year before. The topic is therefore continuing to gain relevance, but only gradually.

The reasons for this slow uptake are the sheer number of options IoT offers and the issue of data security. The inexhaustible amounts of data that are generated, and mostly stored in unstructured form, require considerable processing capacity. Many companies therefore see use cases primarily in quality control and in networked production, where the data remains manageable.

Yet the networking of things will only achieve its big breakthrough when it is combined with artificial intelligence or augmented reality.

How AI makes connected things smart

Collecting data is what makes things in the IoT smarter than they were before, as networked things perform both real-time and post-event analytics. Machine learning or deep learning can be used to detect patterns that make IoT products easier to use, and AI makes it possible to process these patterns and analyses and act on them. For example, a smart home collects data via several sensors, such as on water consumption or heating usage. But it only becomes ‘smart’ with the help of intelligent applications that recognise the patterns in this data and regulate consumption accordingly. Another example of how IoT and AI work together is autonomous technology: an autonomous car, or rather the AI it contains, uses its sensors and the data they collect to assess when to apply the brakes or when a traffic light has turned red, and thus to reproduce sensible driving behaviour.
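To make the smart home example concrete, here is a minimal sketch of the kind of pattern detection described above: a moving average over heating readings flags sustained high consumption so that an application could regulate it. The sensor values, window size and threshold are illustrative assumptions, not the logic of any specific product.

```python
# Minimal sketch: detect sustained high heating usage in smart-home
# sensor data via a moving average (values and thresholds are invented).

from statistics import mean

# Hourly heating usage readings (kWh) collected by an IoT sensor
heating_readings = [1.2, 1.1, 1.3, 2.8, 2.9, 3.0, 1.2, 1.0]

WINDOW = 3          # hours used for the moving average
HIGH_USAGE = 2.5    # kWh threshold marking unusually high consumption

def detect_high_usage(readings, window=WINDOW, threshold=HIGH_USAGE):
    """Return the indices where the moving average exceeds the threshold."""
    flagged = []
    for i in range(window, len(readings) + 1):
        if mean(readings[i - window:i]) > threshold:
            flagged.append(i - 1)  # index of the most recent reading
    return flagged

for hour in detect_high_usage(heating_readings):
    # A real system would trigger an actuator here, e.g. lower the setpoint.
    print(f"Hour {hour}: sustained high heating usage, lowering target temperature")
```

A production system would of course learn such patterns from historical data rather than rely on a fixed threshold, but the principle is the same: sensors deliver the data, and the intelligent application decides what to do with it.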

Why connected things are making AR big

‘Pokémon Go’ is the AR trend everyone knows from the last few years. However, the potential of AR is not limited to apps like this. As manufacturing companies continue to digitalise, they are using more and more IoT products. The networking of machines and operating sites enables service providers to give international corporations immediate assistance when problems occur: a service technician can connect to a smartphone on site and, with the help of AR, project notifications directly onto the screen as soon as the problem area comes into view. It also makes it easier to share expertise. If a skilled worker is absent for a short period, for instance, others can take over and perform each step as instructed through an AR application on a smartphone or smart glasses.
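As a rough illustration of this interplay, the sketch below shows the kind of fault event a networked machine might publish and how an AR client could turn it into an on-screen annotation. The payload fields, anchor logic and colour scheme are invented for illustration and do not reflect any specific product’s API.

```python
# Minimal sketch: a machine-side fault event and its translation into
# an AR overlay annotation (all field names are illustrative assumptions).

import json
from datetime import datetime, timezone

# Event as a machine-side IoT agent might publish it (e.g. over MQTT)
fault_event = {
    "machine_id": "press-07",
    "component": "hydraulic valve B2",
    "severity": "warning",
    "message": "Pressure drop detected, check seal",
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
payload = json.dumps(fault_event)

def to_ar_annotation(raw_payload: str) -> dict:
    """Map a fault event to an overlay annotation for the AR client."""
    event = json.loads(raw_payload)
    return {
        # Where to anchor the label once the component is in view; a real
        # client would resolve this against a 3D model of the machine.
        "anchor": event["component"],
        "label": f'{event["severity"].upper()}: {event["message"]}',
        "colour": "red" if event["severity"] == "critical" else "yellow",
    }

print(to_ar_annotation(payload))
```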

Conclusion

Companies shouldn’t chase a new big IT trend every year; doing so is bad for steady development and a drain on resources. It is more important to keep an eye on how these trends are interrelated, because combining these technologies in a goal-oriented way achieves more than using them haphazardly in isolation. And the fact that this requires the support of specialists does not mean a company is outdated or has fallen behind, only that it pays attention to the quality of the implementation.

The same applies to us: every year revolves around the latest trend, but because we cover many different areas, our strengths also lie in ‘past’ IT trends.

Author Lisette Korte

Lisette Korte works as an IT consultant in the field of digitalisation at adesso in Cologne. She likes to explore current trends in the Internet of Things, and her main focus is on requirements engineering.


