Cognitive computing

Key takeaway: What is cognitive computing? Cognitive computing aims to replicate the human decision-making process. To this end, it collects vast amounts of data; generative AI is used to develop and extend the collected data sets. For now, the range of applications is limited to IIoT, self-driving cars, and applications built around smart devices. Companies engaged in this business will benefit from several positive medium- to long-term secular trends, which we expect will help them deliver above-average earnings growth over the coming decade.

Understanding cognitive computing
Cognitive computing is an extension of 2nd-generation artificial intelligence; it aims to answer questions, create content, draw images, and compose audio based on user input. For now, cognitive computing is still at a very early stage, but ongoing development is expected to take it from its present embryonic state to meaningful applications with real economic value. As such, the technology is expected to have a lasting impact on societal and technological evolution.

Growing digital adoption will lead to further rapid technological development, resulting in lower chip production costs that will support a large variety of different chip sets.

In contrast with the Metaverse, which relies mostly on augmented reality and addresses the needs of specific sectors such as media, entertainment, advertising, and apparel, we believe that cognitive computing will go numerous steps further and perform tasks independently, without human intervention (apart from the initial kick-off), such as autonomous driving, personal assistance in healthcare, and mobile target identification. Another area of cognitive computing explores opportunities in physics and the diagnosis of diseases, where, by extension, it will link up with quantum computing.

The new AI technology is largely due to the emergence of neuromorphic computing. This is in essence a chip-based technology that uses artificial neurons to mimic the functions and characteristics of the brain. The benefit of this approach is improved efficiency and processing speed, which will ultimately lower production costs. As a result, the trend will move away from traditional computer programming by writing code and towards coding the chip sets themselves for general-purpose use. Understanding this progression is key for investors and enterprises, as it will shape business performance for decades to come.
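To make the notion of artificial neurons mimicking the brain more concrete, the sketch below shows a minimal leaky integrate-and-fire neuron, the kind of spiking unit that neuromorphic chips implement in silicon. It is an illustrative Python model, not a description of any particular vendor's chip; the threshold, leak, and input values are assumptions chosen for readability.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative model of the
# kind of spiking unit neuromorphic chips implement in hardware.
# All parameter values below are assumptions chosen for clarity.

def simulate_lif(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate input current over time, leak a little each step,
    and emit a spike (1) whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = leak * potential + current   # integrate with leakage
        if potential >= threshold:               # threshold crossed -> spike
            spikes.append(1)
            potential = reset                    # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Example: a constant weak input produces sparse, periodic spikes.
print(simulate_lif([0.3] * 20))
```

The point of the example is that computation emerges from simple, event-driven units rather than from explicit program logic, which is the property neuromorphic hardware exploits for its efficiency.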

In contrast to other technologies such as the Metaverse, cognitive computing will reach a persistence ratio of 99.999999% (similar to an adult human), with full synchronicity and interoperability. In the case of cognitive computing, persistence relates to a fully fledged range of action: once launched, it will operate without resets or pauses. Synchronicity refers to a full real-time experience – as events are processed locally, no network latency occurs – and full interoperability refers to full acceptance by 3rd-party systems. For this, a commonly agreed protocol for data and content exchange needs to be put in place; given this prerequisite, it can be expected that any 3rd-party exchange will occur through blockchain technology.
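For a rough sense of scale, the short calculation below converts the 99.999999% figure, read as an availability ratio, into permissible downtime per year. This is a back-of-the-envelope illustration under that reading, not a specification from the note.

```python
# Back-of-the-envelope: downtime allowed per year at 99.999999% ("eight nines"),
# if the persistence ratio is read as an availability figure (an assumption here).
availability = 0.99999999
seconds_per_year = 365.25 * 24 * 3600            # ~31.6 million seconds
downtime_seconds = (1 - availability) * seconds_per_year
print(f"~{downtime_seconds:.2f} seconds of downtime per year")  # ~0.32 seconds
```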

Today, the frontier of AI lies somewhere beyond deep learning methods that use neural networks to artificially replicate brain functionality. These kinds of processes are adept at pattern recognition and natural language processing, and they can handle complex communication up to a certain point – activities that until now were exclusive to humans.

Tomorrow, when cognitive computing is integrated into end-users’ smartphones, for instance, applications will be able to learn, build knowledge, and handle situations of varying complexity. The critical point to understand is that neuromorphic-based chips are expected to handle a broader range of AI applications and make real-time decisions without needing to access complex data tables, explicit instructions in code, or millions of prior examples to learn from.

Why do we believe that cognitive computing has a future?
Somehow science fiction predicts our future, and computing has followed the path of Hollywood with a reasonable time lag. Computing platforms have gone from AS/400-type mainframes to mini-computers, to client-server, to PCs, to smartphones. The evolution is driven, but at the same time limited, by three broad laws: a) Moore’s Law (the number of transistors on a chip), b) Keck’s Law (the data-carrying capacity of fiber optics, which governs how quickly we can get content from the air to the app), and c) Metcalfe’s Law (the value of a network as a function of the number of connected users/appliances).
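For illustration, the snippet below sketches the textbook scaling forms usually quoted for two of these laws; the doubling period and the starting values are conventional, assumed figures rather than numbers from this note.

```python
# Illustrative, textbook scaling behind Moore's and Metcalfe's Laws.
# Doubling period and input values are assumptions for the example.

def moore(transistors_today, years, doubling_period=2.0):
    """Moore's Law: transistor counts roughly double every couple of years."""
    return transistors_today * 2 ** (years / doubling_period)

def metcalfe_value(users):
    """Metcalfe's Law: a network's value grows roughly with the square of its users."""
    return users ** 2

# Ten years of Moore's Law is roughly a 32x rise in transistor count ...
print(moore(1e10, years=10) / 1e10)                   # -> ~32.0
# ... and doubling a network's users roughly quadruples its value.
print(metcalfe_value(2_000) / metcalfe_value(1_000))  # -> 4.0
```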

As improvements continue to be pushed into the market, infrastructure and system limitations will push an ever-greater use of 5G wireless technology and bring mobile edge computing power ever closer to the end-user. The most recent developments around the Metaverse and ChatGPT clearly showcase that this process may come earlier than expected.

Given the above considerations, we would not be surprised to see “in person” generative chip sets start to appear among digital natives and early adopters relatively quickly. If so, one could see rapid development towards industrial standards over the next few years. These chips will be an evolution of today’s systems-on-a-chip (SoCs).

Addressable Market 

In one way or another, much of the spending related to cognitive computing will be an extension of existing R&D projects. Traditional CPUs/GPUs rely on external components such as RAM, storage, and other network components. These elements are strategically placed on the motherboard to reduce travel time when data is sent back and forth between the component and the CPU. In the case of SoCs, the chip is a single unit with all components on board, which allows it to serve the consumer’s demand at much higher speed.

For now, the most advanced systems-on-a-chip require complex design and foundry capabilities, and only a few companies excel in this field. Typically, we can identify the following key companies: ASML Holding Inc, Broadcom Inc, Intel Corp, Maxim Integrated, MediaTek Inc, Microchip Technology Inc, NXP Semiconductors N.V., ON Semiconductor, Qualcomm Incorporated, Samsung Electronics Ltd, STMicroelectronics N.V., Texas Instruments, Toshiba Corporation, amongst other smaller players. These companies are expected to share an addressable market of $320 billion by 2030, implying a compound annual growth rate of around 8.1%. This figure does not include side markets such as 3D-sensing modules, cameras, and other essential sensors for temperature, eye tracking, voice recognition, and spatial orientation, nor 6G or even 7G network capacity. The interesting part is that cognitive computing sits as the very top layer of many of these facilities, and the global opportunity at that level is therefore more likely around $750 billion.
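As a quick illustration of the compound-growth arithmetic behind such a forecast, the snippet below works back from the $320 billion 2030 figure at an 8.1% CAGR; the seven-year horizon (2023 to 2030) is our own assumption for the example.

```python
# Compound annual growth rate (CAGR) arithmetic behind the $320bn-by-2030 forecast.
# The 7-year horizon (2023 -> 2030) is an assumption for illustration only.

cagr = 0.081          # 8.1% compound annual growth rate (from the note)
target_2030 = 320.0   # addressable market in USD billions by 2030 (from the note)
years = 7             # assumed horizon, 2023 -> 2030

implied_base = target_2030 / (1 + cagr) ** years
print(f"Implied market size today: ~${implied_base:.0f}bn")   # ~ $185bn

# Forward check: growing that base at 8.1% a year recovers the 2030 figure.
print(f"Projected 2030 market:     ~${implied_base * (1 + cagr) ** years:.0f}bn")
```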

Opportunities ahead

Cognitive computing has a vast arena of applications, ranging from industrial use to highly private arrangements. Cognitive systems are expected to bring exponential improvements in computing performance, and at a lower cost than quantum computing. In recent years, research has made significant progress in creating neuromorphic chips that can process, store, and retrieve content at the same time. Because of their nature, they can interact with a given situation in real time and hence spot any unusual behavior or patterns that fall outside the ordinary context.
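To illustrate the kind of “spot the unusual pattern” task described above, the sketch below flags outliers in a stream of sensor readings using a simple rolling z-score. It is a deliberately conventional, illustrative approach, not a description of how a neuromorphic chip is actually programmed.

```python
# Illustrative only: flagging "out of the ordinary" readings in a data stream
# with a rolling z-score. A neuromorphic chip would perform this kind of detection
# in hardware; this conventional sketch just shows the task itself.
from statistics import mean, stdev

def flag_anomalies(readings, window=10, threshold=3.0):
    """Return indices of readings that deviate strongly from the recent past."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: a steady sensor signal with one sudden spike.
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 1.02, 5.0, 1.0]
print(flag_anomalies(signal))   # -> [11]
```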

While these chip sets sound somewhat futuristic, cognitive computing is expected to be part of our day-to-day equipment of tomorrow. While the capital requirements and computing power needed to build such chips will be significant, the approach is expected to be cheaper in the longer run than today’s, in which data scientists and machine learning experts train AI systems for every possible business case.

Nonetheless, there are some concerns. In recent public discussions, leaders in the field of AI and academics have argued for a pause in development so that the possible interactions of such systems can be better understood and a risk analysis can be performed in areas including data security, privacy, and inbuilt human behaviors, amongst others. While we doubt that a time-out of a few months will be sufficient for an in-depth analysis, we clearly see the benefits of this technology moving forward.