Ideas for Leaders #387

Decision Support Systems: Under-rated and Under-used?


Key Concept

Technology now provides a range of decision support systems to interrogate, process and analyse data on markets and customers, and to help companies answer ‘what-if’ questions. The best of them, however, may be neglected by organizations. Recent empirical research finds a clear discrepancy between users’ perceptions of decision support systems and how these systems actually perform.

Idea Summary

Technology acceptance research tells us that user evaluations (i.e. beliefs, perceptions and attitudes) strongly influence rates of adoption and diffusion. The problem for companies that develop and sell new technologies — and for those that would benefit from using and buying them — is that user evaluations aren’t always accurate or fair.

As many studies in psychology and other fields have shown, human perception and judgment can be clouded by biases. What someone says about a technological tool might not be a reliable indicator of how well it actually performs. People can be quick to blame ‘the system’ when things go wrong — and quick to take the credit when they go right. The difficulty of recognizing and/or isolating the exact contribution made by technology to improvements in performance adds to the problems.

The gap between the perceived usefulness of technology and its actual performance has recently been explored in empirical research on decision support systems (DSSs). IT-enabled tools designed to improve both the efficiency and quality of decision-making, DSSs can ‘mine’ data to provide specific solutions and recommendations, compile information for users, and ‘distil’ data into charts and graphics. Applications include marketing and customer relationship management (CRM), problem-solving, and the identification and management of risk.

Building on earlier studies, the researchers investigated the disconnect between user evaluations of DSSs and their actual performance in two controlled laboratory experiments. The first involved the use of three DSSs that are available commercially — mind mappers, guides to the creative process, and stimulus providers — to come up with creative ideas to solve a specific business problem. In the second, participants were asked to apply two more specialist DSSs to a potentially more complex problem — designing a creative marketing campaign for a specific brand.

The results add to an already bleak picture. The researchers found no significant positive correlation between user evaluations and actual performance. Worse, they found that improvements in actual performance were associated with less favourable evaluations of a DSS.

Length and intensity of use appeared to drive user evaluations rather than actual performance — in the second study, users tended to give higher marks to the DSS when they worked with it over a longer period and more frequently.

The results imply that IT-enabled tools that are capable of adding real value are in danger of harmful neglect in organizations — and that they may be losing out to dysfunctional alternatives.

Business Application

Neglect of performance-enhancing DSSs is a potentially serious problem. How can it be solved?

The researchers point to two possible interventions:

  • Telling success stories — publicising (in-company) experiments or field studies that demonstrate the positive effects of DSSs on creative performance — and warning users that the benefits might only become evident after a period of extended use.
  • Using efficiency gains as ‘bait’. Emphasizing tangible benefits such as reduced time and effort might stimulate use of DSSs whose other benefits, such as improved decision-making, are harder to quantify.

In addition, the researchers underline the importance of selecting DSSs that offer ‘intuitive’ interfaces, access to a wide range of analytical functions, well-designed graphics and fast response times. People tend to base their evaluations on aspects of systems they can easily observe. To be rated highly, a DSS has to be user-friendly.


Idea conceived

  • December 2012

Idea posted

  • June 2014

DOI number

10.13007/387

Subject

Real Time Analytics