Artificial intelligence (AI) is no longer just the stuff of science fiction movies; it’s real, and it is transforming the way we live and do business. AI encompasses several technologies, including machine learning (ML), deep learning (DL), and chatbots. AI is currently being used in domains such as finance, manufacturing, healthcare, automotive, entertainment, security, and cyber-physical systems, to name a few. Large investments are also being made in the research, development, and marketing of AI products, tools, and services. There is increasing awareness of AI and of its promises, limitations, and concerns. Technology giants and startups alike are competing fiercely for dominance in the AI landscape. Ongoing developments in AI and the resulting …
Posts Tagged 'machine learning'
Machine learning is a specialization within artificial intelligence (AI) that allows computers to learn on their own. In “Why Machine Learning is Crucial to Effective Utilization of Big Data,” Cutter Consortium Senior Consultant Greg Smith and his coauthors explain that rather than receiving explicit programming instructions on what to do, computers are provided a set of rules and can then self-train, developing a solution by themselves. With ML techniques, computers iteratively learn from data and can uncover hidden insights within it. Why can machine learning offer such a large advantage over human intellect? According to Smith, ML does not rely on human …
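The iterative learning loop the excerpt describes can be sketched in a few lines. This is a minimal illustration, not code from the article: a model fits a line y = w·x + b by repeatedly nudging its parameters toward the data, never being told the underlying rule. The data and learning rate here are illustrative assumptions.

```python
# A minimal sketch of "iteratively learning from data": fitting a line
# y = w*x + b by gradient descent, in plain Python with no ML libraries.

def fit_line(xs, ys, lr=0.01, epochs=2000):
    """Learn slope w and intercept b from (x, y) pairs by repeated
    small corrections -- the iterative loop the excerpt alludes to."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from the hidden rule y = 2x + 1; the learner is never
# told this rule, yet recovers it from the examples alone.
w, b = fit_line([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
print(round(w, 2), round(b, 2))  # converges toward w = 2, b = 1
```

The point is the shape of the process, not the model: the same fit-measure-correct loop, scaled up to millions of parameters, underlies the techniques the article discusses.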
Recently, Cutter Fellow Steve Andriole took a wide look at insurtech. His insights, along with a collection of articles on the topic, appear in a recent issue of Cutter Business Technology Journal (CBTJ). What exactly is insurtech? According to Investopedia, insurtech refers to “the use of technology innovations designed to squeeze out savings and efficiency from the current insurance industry model.… The belief driving insurtech companies is that the insurance industry is ripe for innovation and disruption.” Steve Andriole brings a unique perspective to insurtech, having served as CTO and Senior VP for Technology Strategy at CIGNA Corporation, a $20 billion global insurance and financial services company. Writes Steve: The insurance industry is very quickly evolving, …
With data being collected by organizations at a staggering rate, the demand for analytics to leverage insights from this data is growing just as fast. Big data can be viewed as a gateway to new opportunities, as a means of managing risk, or as a tool for improving business sustainability. Oftentimes, big data is associated with two keywords: analytics and technologies. These keywords represent an evolving suite of trends, from descriptive, predictive, and prescriptive analytics to the application of machine learning and cloud technologies. Continuously monitoring these trends in analytical approaches and technological breakthroughs in the context of big data, and applying them to produce business value, is the key to survival in this …
Early adopters of interconnected digital technologies in the industrial sector are realizing the benefits of improved operational efficiency, productivity, safety, cost savings, profitability, and customer engagement and satisfaction. These technologies include those related to the Industrial Internet of Things (IIoT) movement, such as AI, machine learning, big data, predictive analytics, machine-to-machine communication, and blockchain. Industries in the manufacturing, energy, utilities, automotive, and aviation sectors, to name a few, are capitalizing on investments in these revolutionary technologies. An upcoming issue of Cutter Business Technology Journal, with Guest Editors Patrikakis Charalampos and Jose Barbosa, will examine emerging trends and strategies in digital transformation in the industrial sector. How are IIoT, predictive analytics, AI, big data, blockchain, and other technologies being …
In the 1980s, everyone got excited about the possibilities of artificial intelligence. The excitement grew for a few years and then gradually faded as companies found it too hard to build and maintain useful expert systems or natural language interfaces. Recently, however, there has been renewed interest in developing software applications that can interact with people in natural language, perform complex decision-making tasks, or assist human experts in complex analysis efforts. Today these systems are called cognitive computing or machine learning systems. They build on research from artificial intelligence laboratories and use new techniques, such as deep learning and reinforcement learning, which seem to overcome some of the problems encountered with earlier AI …
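Reinforcement learning, one of the techniques named above, is worth a concrete toy example: instead of being given labeled answers, an agent learns from trial-and-error rewards. This is a hedged sketch using tabular Q-learning on a made-up five-state corridor; the environment, rewards, and hyperparameters are illustrative assumptions, not anything from the article.

```python
# Toy reinforcement learning: tabular Q-learning on a 5-state corridor.
# The agent starts at state 0 and earns a reward only by reaching the
# rightmost state; it must discover "always move right" by trial and error.
import random

N_STATES = 5          # states 0..4; state 4 is the goal
ACTIONS = [-1, +1]    # move left or move right

def step(state, action):
    """Environment dynamics: clamp to the corridor, reward at the goal."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    done = nxt == N_STATES - 1
    return nxt, (1.0 if done else 0.0), done

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy: mostly exploit the best-known action,
            # occasionally explore a random one.
            if random.random() < eps:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            nxt, r, done = step(s, a)
            best_next = max(q[(nxt, a2)] for a2 in ACTIONS)
            # Q-learning update: move the estimate toward reward plus
            # discounted value of the best next action.
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = nxt
    return q

q = train()
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # the learned policy moves right from every state
```

Nobody programs the winning strategy in; it emerges from repeated interaction with the environment, which is the property that distinguishes this family of techniques from the hand-built rule bases of the 1980s.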
Cognitive computing is among the major trends in computing today and seems destined to change how businesspeople think about the ways computers can be used in business environments. Yet “cognitive computing” is a vague term used in a myriad of ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I of a two-part series) describes what we mean by cognitive computing by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM’s Watson, and (5) Google’s AlphaGo. Here is a brief description of each. Rules-based expert systems: There have been other attempts to commercialize artificial intelligence …
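The first perspective, a rules-based expert system, is simple enough to sketch directly. This toy forward-chaining engine is an illustration only; the animal-identification rules are a classic textbook example, not content from the report.

```python
# A minimal rules-based expert system: forward chaining over if-then
# rules, firing any rule whose conditions are all known facts and adding
# its conclusion, until no new facts can be derived.

RULES = [
    ({"has_feathers"}, "is_bird"),
    ({"is_bird", "can_swim"}, "is_penguin"),
    ({"gives_milk"}, "is_mammal"),
]

def forward_chain(facts, rules):
    """Derive everything the rule base entails from the starting facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"has_feathers", "can_swim"}, RULES))
# derives "is_bird" first, which then lets "is_penguin" fire
```

The strength and the weakness of this approach are the same thing: every conclusion is traceable to an explicit rule, but someone must author and maintain all of those rules by hand, which is what made the 1980s systems so costly to keep useful.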