Posts Tagged 'AI'

Artificial intelligence (AI) is no longer just the stuff of science fiction movies; it is real and is transforming the way we live and do business. AI encompasses several technologies, including machine learning (ML), deep learning (DL), and chatbots. AI is currently being used in domains such as finance, manufacturing, healthcare, automotive, entertainment, security, and cyber-physical systems, to name a few. In addition, large investments are being made in the research, development, and marketing of AI products, tools, and services. Awareness of AI and its promises, limitations, and concerns is growing, and competition to gain dominance in the AI landscape is fierce among technology giants and startups alike. Ongoing developments in AI and the resulting …
Machine learning (ML), a specialization within artificial intelligence (AI), allows computers to learn on their own. In "Why Machine Learning is Crucial to Effective Utilization of Big Data," Cutter Consortium Senior Consultant Greg Smith and his co-authors explain that computers are provided a set of rules, rather than explicit programming instructions on what to do, and can then self-train and develop a solution by themselves. With ML techniques, computers iteratively learn from data and can uncover hidden insights in it. Why can machine learning offer such a large advantage over human intellect? According to Smith, ML does not rely on human …
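To make the contrast with explicit programming concrete, here is a minimal sketch in Python; it assumes the scikit-learn library, and the tiny transaction dataset and flagging task are invented purely for illustration:

# Learning from data rather than hand-coding rules.
# Assumes scikit-learn is installed; the dataset is invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [transaction amount in dollars, hour of day]; label 1 = flagged.
X = [[20, 14], [35, 10], [900, 3], [15, 16], [1200, 2], [40, 11]]
y = [0, 0, 1, 0, 1, 0]

# No explicit "if amount > 500 then flag" rule is written anywhere;
# the model infers a decision boundary from the labeled examples.
model = DecisionTreeClassifier().fit(X, y)

print(model.predict([[800, 4]]))  # -> [1]: a learned, not hand-coded, decision

The point is in the comments: the programmer never writes the flagging rule; the model derives its own decision boundary from the labeled examples.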
Early adopters of interconnected digital technologies in the industrial sector are realizing the benefits of improved operational efficiency, productivity, safety, cost savings, profitability, and customer engagement and satisfaction. These technologies include those related to the IIoT movement, such as AI, machine learning, big data, predictive analytics, machine-to-machine communication, and blockchain. Industries in the manufacturing, energy, utilities, automotive, and aviation sectors, to name a few, are capitalizing on investments in these revolutionary technologies. An upcoming issue of Cutter Business Technology Journal, with Guest Editors Charalampos Patrikakis and Jose Barbosa, will examine emerging trends and strategies in digital transformation in the industrial sector. How are IIoT, predictive analytics, AI, big data, blockchain, and other technologies being …
We are currently in the midst of the Fintech revolution. Technology is significantly transforming and disrupting the financial services landscape, presenting major opportunities and challenges for incumbents, start-ups, and regulatory bodies alike. Fintech is evolving beyond being simply the realm of start-up disruptors: many incumbents are now making significant R&D investments, and many are collaborating with start-ups. Both interest and investment in Fintech have exploded in the last few years. Fintech, the intersection of finance and technology, saw 67% growth in investment in the first quarter of 2016, with investment reaching $5.3 billion (Accenture, 2016). PwC (2016) reports that 83% of financial companies believe that specific aspects of …
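As a rough sanity check on those figures (assuming the 67% growth is measured against a comparable baseline quarter, as such headline numbers usually are): a 67% rise to $5.3 billion implies baseline investment of about $5.3B / 1.67 ≈ $3.2 billion.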
In the 1980s, everyone got excited about the possibility of artificial intelligence. The excitement grew for a few years and then gradually faded as companies found it too hard to build and maintain useful expert systems or natural language interfaces. Recently, however, there has been renewed interest in developing software applications that can interact with people in natural language, perform complex decision-making tasks, or assist human experts in complex analysis efforts. Today these systems are called cognitive computing systems or machine learning. They build on research from artificial intelligence laboratories and use new techniques, such as deep learning and reinforcement learning, which seem to overcome some of the problems encountered with earlier AI …
Cognitive computing is among the major trends in computing today and seems destined to change how businesspeople think about the ways computers can be used in business environments. Yet "cognitive computing" is a vague term used in a myriad of ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I in a two-part series) describes what we mean by cognitive computing by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM's Watson, and (5) Google's AlphaGo. Here is a brief description of each.

Rules-Based Expert Systems

There have been other attempts to commercialize artificial intelligence …
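As a toy illustration of the first perspective, here is a minimal forward-chaining sketch in Python; the rules and facts are invented purely for illustration and are not drawn from the report. Knowledge lives in explicit if-then rules, and a simple inference loop fires any rule whose conditions are met until nothing new can be derived:

# A toy forward-chaining loop in the spirit of a rules-based expert system.
# The rules and facts below are invented purely for illustration.
rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_specialist"),
]
facts = {"has_fever", "has_rash"}

# Fire every rule whose conditions are all known facts; repeat until
# a full pass derives nothing new.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
# ['has_fever', 'has_rash', 'recommend_specialist', 'suspect_measles']

Unlike the learning-based approaches in the other perspectives, every conclusion here is traceable to a human-authored rule, which is both the approach's appeal and, historically, its maintenance burden.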
Life complexifies. Perhaps it is a fundamental law of information that complexity increases. In the world of biology, organisms become more complex over time, with new genetic permutations appearing alongside old genetic pieces. In the hyperastronomical space of the animal genome, nature constantly produces new combinations. The same is true of human knowledge and scientific discovery: new insights are built on top of old ones. Breakthroughs in insight usually carry higher levels of complexity and hence require higher levels of abstraction and more difficult codification to accommodate the widening domain they cover. We all know E = mc², but how many of us really know what it means? In the world of medicine, treatments are …