Posts Tagged 'AI'

Early adopters of interconnected digital technologies in the industrial sector are realizing improvements in operational efficiency, productivity, safety, cost savings, profitability, and customer engagement and satisfaction. These technologies include those related to the IIoT movement, such as AI, machine learning, big data, predictive analytics, machine-to-machine communication, and blockchain. Industries in the manufacturing, energy, utilities, automotive, and aviation sectors, to name a few, are capitalizing on investments in these revolutionary technologies. An upcoming issue of Cutter Business Technology Journal, with Guest Editors Patrikakis Charalampos and Jose Barbosa, will examine emerging trends and strategies in digital transformation in the industrial sector. How are IIoT, predictive analytics, AI, big data, blockchain, and other technologies being … Read more
We are currently in the midst of the Fintech revolution. Technology is significantly transforming and disrupting the financial services landscape, presenting major opportunities and challenges for incumbents, start-ups, and regulatory bodies. Fintech is evolving beyond the realm of start-up disruptors: many incumbents are now making significant R&D investments, and many are collaborating with start-ups. Both interest and investment in Fintech have exploded in the last few years. Fintech, the intersection of finance and technology, experienced 67% growth in investment in the first quarter of 2016, with investment reaching $5.3 billion (Accenture, 2016). PwC (2016) reports that 83% of financial companies believe that specific aspects of … Read more
In the 1980s, everyone got excited about the possibility of artificial intelligence. The excitement grew for a few years and then gradually faded as companies found it too hard to build and maintain useful expert systems or natural language interfaces. Recently, however, there has been renewed interest in developing software applications that can interact with people in natural languages, perform complex decision-making tasks, or assist human experts in complex analysis efforts. Today these systems are called cognitive computing or machine learning systems. They rely on research from artificial intelligence laboratories and use new techniques, like deep learning and reinforcement learning, that seem to overcome some of the problems encountered with earlier AI … Read more
Cognitive computing is among the major trends in computing today and seems destined to change how businesspeople think about the ways computers can be used in business environments. "Cognitive computing" is a vague term used in myriad ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I in a two-part series) describes what we mean by cognitive computing by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM's Watson, and (5) Google's AlphaGo. Here is a brief description of each. Rules-Based Expert Systems There have been other attempts to commercialize artificial intelligence … Read more
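To make the first of those perspectives concrete, here is a minimal sketch of how a rules-based expert system works: a forward-chaining engine repeatedly fires if-then rules against a set of known facts until no new conclusions can be drawn. The rule names and the loan-screening facts below are illustrative assumptions, not drawn from any product discussed in the report.

```python
# Minimal forward-chaining rule engine (a sketch, not a production system).
# Each rule is a pair: (set of condition facts, conclusion fact).

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and conditions <= facts:
                facts.add(conclusion)  # rule fires, derive the conclusion
                changed = True
    return facts

# Toy knowledge base for a hypothetical loan-screening domain.
rules = [
    ({"income_verified", "credit_score_high"}, "low_risk"),
    ({"low_risk"}, "approve_loan"),
]

derived = forward_chain({"income_verified", "credit_score_high"}, rules)
print(sorted(derived))
```

The second rule only fires after the first has derived "low_risk", which is the chaining behavior that distinguished 1980s expert systems from simple lookup tables.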
Life complexifies. Perhaps it is a fundamental law of information that the complexity of information increases. In the world of biology, organisms become more complex over time, with new genetic permutations appearing alongside old genetic pieces. In the hyperastronomical space of the animal genome, nature constantly produces new combinations. The same is true of human knowledge and scientific discovery: new insights are built on top of old ones. Breakthroughs in insight usually have higher levels of complexity and hence require higher levels of abstraction and more difficult codification to accommodate the widening domain covered. We all know E = mc², but how many of us really know what it means? In the world of medicine, treatments are … Read more
One cannot have superior science and inferior morals. The combination is unstable and self-destroying. — Arthur C. Clarke The late futurist and science fiction writer Arthur C. Clarke's observation has long been a staple theme of science fiction stories, especially those involving smart machines and whether the algorithms used to make decisions would work for the benefit of humankind or its destruction. As artificial intelligence (AI) and robotics research has progressed alongside growth in computing power, that programming question has steadily moved out of the realm of science fiction and into the computing technical community over the past decade. This has been especially true in the military establishment as the use of robotics rapidly … Read more