
Article: Monday, 24 March 2014

A serious disconnect between how users perceive decision-support systems and how these systems actually perform could lead to “harmful neglect” of such potentially performance-enhancing business tools. On the bright side, something can be done about it.

When you consider that accurate and timely information on key markets and customers is the lifeblood of decision makers, it is not surprising that these decision makers deploy a whole range of information technology tools – decision-support systems (or DSSs) – capable of trawling, processing and analysing huge amounts of business-related data in search of patterns, correlations and trends, as well as crucial answers to key what-if queries. All this information is used in preparing, refining and managing business plans and marketing campaigns that eventually have a bearing on a company's performance and bottom line.

Since the late 1960s, DSSs have evolved in the field of marketing. So-called marketing decision-support systems (MDSSs) now include marketing models, expert systems, case-based reasoning systems, creativity support systems and marketing dashboards. In the fast-moving consumer-goods industry, companies deploy them to investigate, for example, how customers with different characteristics and requirements will react to a marketing campaign, discovering new sub-segments in the process. Banks and insurance companies make use of customer relationship management systems (CRMs) to identify possible customers for niche products or services.

Mind the gap

But are decision makers arming themselves properly with the best tools? Unfortunately, the sixteen DSS studies we looked at revealed that these systems are not being deployed to their fullest potential. The same complaint applies to DSSs in a broader sense: there is a gap between what these systems could deliver and what is actually realised.

A key factor in a tool's adoption and usage – “perceived usefulness” – depends on the extent to which decision makers think it contributes to decision quality, that is, to performance. Simply put, if a DSS makes an important contribution to performance but decision makers do not recognise this (because perceived usefulness is low), adoption and usage of the DSS will suffer, and ultimately so will the firm's results.

We conducted two experimental studies on the use and efficacy of DSSs. In the first study, the task was to come up with creative ideas in response to a business problem. We were thus able to measure actual performance by the quality (effectiveness) and number (productivity) of the ideas generated.

The task in the second study was to design a creative marketing campaign, arguably a more complex task than the one in the first study. Again, the purpose was to examine whether users evaluate more effective DSSs more favourably. Here we compared the performance of two prototypical DSSs supporting the design of creative marketing campaigns with a baseline condition in which no DSS was available.

The results of our empirical studies add to an already-bleak picture. Not only did we fail to find significant positive correlations between user evaluations and actual performance in either study; we even found significantly negative ones, meaning that improvements in actual performance were associated with less favourable evaluations of the DSS in question.

Crucially, our findings imply that if users were to follow their own perceptions, effective DSSs might not be adopted and used, while ineffective DSSs might be. Apparently, recognition of the contribution a DSS makes is not automatic. Our studies also show that user evaluations of performance-enhancing DSSs appear to be insufficient to guide their adoption and continued use.

In terms of further research, we feel it is important to study the conditions that facilitate or hinder users in forming accurate performance-related evaluations, which are essential to ensure good managerial decision-making. Hopefully, this will help to design effective interventions that facilitate the adoption and use of performance-enhancing DSSs in practice. However, what kind of interventions could be tried in the meantime to alleviate this problem?

Accentuating the positive

Two potentially effective strategies come to mind. First, publicise success stories based on in-company experiments or field studies that demonstrate the positive effects of a DSS on creative performance. However, it is also important to warn users that such performance improvements may be difficult to assess immediately and that the contribution of a DSS may only become evident after a period of extended use.

And second, employ efficiency gains to get users interested. After all, it is generally easier to assess these types of gains (time saved, for example) than to assess improvements in decision quality. We discovered that users tended to evaluate a DSS more positively when they were able to construct a solution more quickly with it. Such efficiency gains could be emphasised to stimulate use, which may eventually enhance decision quality through (increased) DSS usage.

It should be noted that outside marketing, for example in operations management, DSSs are mainly deployed for efficiency gains. Results in marketing, however, depend on a huge number of different factors (such as the marketing mix, competitors and the economic climate), making it difficult to isolate the contribution of the DSS to decision quality.

Finally, there is also the “usability” of the tool to consider. It goes without saying that a DSS selected for deployment should fulfil several basic user requirements to make it attractive to use. Characteristically interactive, intuitive and user-friendly, it should also offer decision makers – sitting at their desks – access to a wide range of standard analytical functions and well-presented graphical information, as well as split-second response times.

prof.dr.ir. B. (Berend) Wierenga
Professor Emeritus of Marketing
Rotterdam School of Management (RSM)
Erasmus University Rotterdam