With organizations collecting data at a staggering rate, the demand for analytics to leverage insight from that data is growing just as fast. Big data can be viewed as a gateway to new opportunities, a means of managing risk, or a tool for improving business sustainability. Oftentimes, big data is associated with two keywords: analytics and technologies. These keywords represent an evolving suite of trends – from descriptive, predictive, and prescriptive analytics to the application of machine learning and cloud technologies. Continuously monitoring these trends in analytical approaches and technological breakthroughs in the context of big data, and applying them to produce business value, is the key to survival in this Read more
Posts Tagged 'machine learning'
Early adopters of interconnected digital technologies in the industrial sector are realizing the benefits of improved operational efficiency, productivity, safety, cost savings, profitability, and customer engagement and satisfaction. These technologies include those related to the IIoT movement, such as AI, machine learning, big data, predictive analytics, machine-to-machine communication, and blockchain. Companies in the manufacturing, energy, utilities, automotive, and aviation sectors, to name a few, are capitalizing on investments in these revolutionary technologies. An upcoming issue of Cutter Business Technology Journal with Guest Editors Charalampos Patrikakis and Jose Barbosa will examine emerging trends and strategies in digital transformation in the industrial sector. How are IIoT, predictive analytics, AI, big data, blockchain, and other technologies being Read more
In the 1980s, everyone got excited about the possibility of artificial intelligence. The excitement grew for a few years and then gradually faded as companies found that it was too hard to build and maintain useful expert systems or natural language interfaces. Recently, however, there has been renewed interest in developing software applications that can interact with people in natural language, perform complex decision-making tasks, or assist human experts in complex analysis efforts. Today these systems are called cognitive computing or machine learning systems. They rely on research from artificial intelligence laboratories and use new techniques, like deep learning and reinforcement learning, that seem to overcome some of the problems encountered with earlier AI Read more
Cognitive computing is among the major trends in computing today and seems destined to change how business people think about the ways in which computers can be used in business environments. “Cognitive computing” is a vague term used in a myriad of ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I in a two-part series) describes what we mean by cognitive computing by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM’s Watson, and (5) Google’s AlphaGo. Here is a brief description of each.
Rules-Based Expert Systems
There have been other attempts to commercialize artificial intelligence Read more