May 03, 2016
 
EA's Role in the Innovation Management Process

Ask the CEOs of some of the most successful and influential companies in the world, such as GE and Google, and a clear definition of innovation management emerges. The definition addresses the need to implement organizational goals and objectives quickly and effectively to remain competitive, and the desire to strengthen advantages through the adoption of innovative ideas, products, processes, and business models.[1] Enterprises facing increasing competition and the pressure of technological innovation are beginning to realize that to drive organic business growth and maintain a competitive advantage, they need to discover and implement innovations quickly and with great care to ensure maximum value. One-off innovations are moderately easy to take advantage of, but to create a pipeline of …

May 03, 2016
 
Five Perspectives on Cognitive Computing

Cognitive computing is among the major trends in computing today and seems destined to change how businesspeople think about the ways computers can be used in business environments. “Cognitive computing” is a vague term used in myriad ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I in a two-part series) describes what we mean by the term by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM’s Watson, and (5) Google’s AlphaGo. Here is a brief description of each.

Rules-Based Expert Systems

There have been other attempts to commercialize artificial intelligence …
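To make the first perspective concrete: a rules-based expert system encodes expert knowledge as if-then rules and derives conclusions from known facts with an inference engine. The following is a minimal sketch of forward chaining; the rules, facts, and names are invented for illustration and do not come from the report.

```python
# Toy forward-chaining rules engine, illustrating the "rules-based
# expert system" perspective. All rules and facts are hypothetical.

RULES = [
    # (antecedents, consequent): when every antecedent is a known fact,
    # the consequent is added to the set of known facts.
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu", "short_of_breath"}, "refer_to_doctor"),
]

def infer(initial_facts):
    """Apply the rules repeatedly until no new fact can be derived."""
    facts = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

print(sorted(infer({"has_fever", "has_cough", "short_of_breath"})))
```

Running the sketch with the three symptoms derives both `possible_flu` and `refer_to_doctor`; real expert-system shells add conflict resolution, certainty factors, and explanation facilities on top of this basic loop.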

Apr 22, 2016
 
Call for Papers: Cyber Threats in the Era of the Internet of Everything

This upcoming issue of Cutter IT Journal seeks articles on new approaches, strategies, and solutions to help IT professionals address and prevent cyber attacks stemming from IoT-related devices. Cyber threats have been on the rise, all the more so with the advent of the Internet of Everything (IoE). Common appliances now feature intelligent processing and real-time connections to the Internet. Health measurements are collected in real time by smart wearables, including general-purpose smartwatches. The latest models from automobile manufacturers feature cloud connectivity that enables remote software updates, fuel-consumption tracking, and streamed dashcam activity. On a larger scale, the smart grid ensures seamless and dynamic allocation of energy where …

Apr 19, 2016
 
Preparing for the Disruption of the IoT

The Internet of Things (IoT) promises to disrupt almost every industry. Companies need to examine how they can take advantage of connected products and services and plan for the significantly increased data workloads that will likely come with the deployment of sensor-enabled products. However, an expected surge in product innovation also means that companies should carefully consider how they will deal with the potential rise of new, more agile competitors whose business models are based primarily on IoT products and services. Here are some points about the IoT I’ve been discussing with colleagues that organizations may want to consider.

Bigger, Faster, Varied Data and New Data Management Practices

The expected myriad of …

Apr 19, 2016
 
Six Ways to Reduce Tech Debt

What strategies do you apply to modernizing a product code base, and what results do they yield? This Advisor takes a retrospective look at a past project, both to describe the strategies my colleagues and I used to rearchitect the product and to validate their effectiveness with two technical debt assessments conducted via Cutter’s Technical Debt Assessment and Valuation practice. The six strategies we used are presented here. The two assessments evaluate the measured impact of the team’s efforts on the system and compare it to the actual time spent modernizing the code. This is the story of the DeLorean system, a client’s longtime production setup. (While not …

Apr 14, 2016
 
Don't Let Agile Become Bad Science

The latest enthusiasm for hypothesis testing in the Agile community is a good thing… until it turns bad. If we’re not careful about how we do hypothesis testing, that’s exactly what could happen. Hypothesis testing means applying the scientific method, which involves doing something really, really hard: putting our cherished beliefs to the test, not to prove them but to disprove them. Any fool can come up with “evidence” to support a hypothesis. Why do I think matching socks keep disappearing after I do the laundry? Demons steal them. How do I know? If I’m really committed to this explanation, I’ll find some way to support this novel viewpoint. Without this core commitment to testing to disprove, we …
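One concrete way to "test to disprove" an Agile experiment's result is to ask how easily chance alone could have produced it. The sketch below runs a simple permutation test on hypothetical A/B conversion counts; every number in it is invented for illustration, and a real experiment would need proper design and sample-size planning.

```python
# Testing to *disprove*: given a seemingly better variant B, ask how
# often random shuffling of the pooled outcomes produces a difference
# at least as large (a basic permutation test). All data hypothetical.
import random

random.seed(0)

# Hypothetical outcomes per user: 1 = converted, 0 = did not.
variant_a = [1] * 30 + [0] * 70   # 30% conversion
variant_b = [1] * 38 + [0] * 62   # 38% conversion
observed_diff = sum(variant_b) / len(variant_b) - sum(variant_a) / len(variant_a)

pooled = variant_a + variant_b
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)          # pretend the labels A/B were arbitrary
    a, b = pooled[:100], pooled[100:]
    if abs(sum(b) / 100 - sum(a) / 100) >= abs(observed_diff):
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed_diff:.2f}, p ≈ {p_value:.3f}")
```

If the shuffled data frequently matches the observed gap, the "improvement" survives no attempt at disproof and the honest conclusion is that the experiment hasn't shown anything yet.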

Apr 13, 2016
 
Bob Charette Wins Esteemed Neal Award for Editorial Excellence

A shout-out to Cutter Fellow Bob Charette for winning the esteemed 2016 Jesse H. Neal Award for his series of articles, “Lessons from a Decade of IT Failures,” detailing the takeaways from tracking the big IT debacles of the last ten years. After his 2005 article “Why Software Fails,” Bob started tracking, documenting, and blogging about technology failures of all sizes. Ten years later, he selected five of these IT project failures to feature in the report “Lessons from a Decade of IT Failures,” replete with interactive graphs and charts, and now recognized for its editorial excellence in business media. A very well-deserved honor. Congratulations, Bob!