
How to Deliver Internal Business Intelligence Services?

How Different Strategies Allow Companies to Succeed by Failing Fast

Business intelligence and analytics (BI&A) are a category of computing technologies and supporting processes facilitating data collection, storage, access, and analysis to improve managerial decision-making (Chaudhuri, Dayal, & Narasayya, 2011; Watson, 2009). Business intelligence generally refers to technologies and processes which provide descriptive understanding of data sets, from non-interactive reports to dynamic dashboards capable of drill-down activities. Front-end tools such as Tableau, Qlik, Cognos, and others provide multiple configuration options to create stunning, interactive data visualizations. At the back-end, high-performance databases with support for relational, object, and non-relational data capable of storing and accessing traditional structured data as well as unstructured multiformat data are critical to BI activities. BI product options abound, but Microsoft SQL Server, Oracle Enterprise Server, IBM DB2, SAP ASE, and MySQL are leading choices.

Business analytics (also commonly known as data analytics) is generally used to describe higher-level statistical analysis of large data sets. While data analysis is necessary to the development of a story conveyed via BI reports, the value of data analytics lies in its predictive and prescriptive nature. Predictive analytics uses sophisticated data modeling to make forecasts, and prescriptive analytics makes specific recommendations on an optimal course of action (Davenport, 2013).
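
To make the distinction concrete, the short sketch below (not taken from the source; all figures and names are invented) fits a simple linear trend to hypothetical monthly sales and forecasts the next month (predictive), then converts that forecast into a recommended order quantity with a safety margin (prescriptive).

```python
# A minimal sketch of the predictive/prescriptive distinction,
# using hypothetical monthly sales figures.

from statistics import mean

monthly_sales = [120, 132, 141, 150, 158, 171]  # hypothetical units sold

# Predictive: fit a simple linear trend and forecast next month.
x = list(range(len(monthly_sales)))
x_bar, y_bar = mean(x), mean(monthly_sales)
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, monthly_sales)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar
forecast = intercept + slope * len(monthly_sales)

# Prescriptive: turn the forecast into a recommended action, e.g. order
# enough stock to cover the forecast plus a safety margin.
safety_margin = 0.10
recommended_order = round(forecast * (1 + safety_margin))

print(f"Forecast for next month: {forecast:.0f} units")
print(f"Recommended order quantity: {recommended_order} units")
```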

In addition, business analytics generally assumes self-service activities, in which the analysis is performed by business users who are also specialists in data analysis techniques. Business analytics is closer to process than to product, but many (if not all) of the products listed above support analytics processes adequately.

BI&A success stories have more to do with the company’s innovative use of existing tools than with the capabilities of the tools themselves. The opportunities are endless: the Burberry Group personalizes in-store customer shopping experiences with the use of RFID tags and targeted videos on display screens; CVS Health matches call center agents with the types of customers agents are more likely to engage positively; L’Oreal mines social media posts and routes appropriate ones to employees who can engage with the post’s author; Lockheed Martin mines project data for early indicators of trouble; and Petroleos Mexicanos uses sensors to monitor equipment noise to proactively schedule maintenance and reduce downtime (Laskowski, 2015). For simplicity, the term “BI” will be used here to refer to any and all technologies and processes described here as included in BI&A.

This article provides a discussion of the three most commonly adopted approaches to the provision of internal BI services by organizations: on-premises, on the cloud, and a hybrid approach of the two. Each approach is described along with some of its common benefits and limitations on the spectrum of BI service provisioning.

An on-premises strategy involves the use of the firm’s own human and technology assets to complete all business analytics activities resulting in any BI product. At the complete opposite end of the spectrum, a pure cloud strategy requires the use of external, network-accessible vendors, commonly referred to as cloud providers, for all BI activities.

The description of this strategy will also be accompanied by a discussion of its defining characteristics, primary service models, and common deployment modalities. Finally, in a hybrid strategy, companies offer BI services using a combination of their own existing infrastructure and cloud providers.

Business Intelligence Systems and Analytics

What constitutes a product or service in the business intelligence and analytics space is still relatively open to debate and can include anything from static reports (printed or electronic) to mobile, customizable/configurable, and on-demand data visualizations with real-time data and automated insight discovery. The simplest BI&A tools provide a summary view of historical data, often without the ability to request additional context for the data.

Farther along the spectrum of complexity, Apoteket, a state-owned Swedish pharmacy retailer, uses dashboard applications to analyze marketing data to identify effective store promotions, track the performance of products via online orders, and coordinate supply chains with pharmaceutical manufacturers and other retail partners (Tableau.com). The technologies which make this possible are, generally, familiar technologies: large, high-performance databases, telecommunications networks, mobile devices, and user-friendly interfaces. The business processes they support are also very familiar: identify profitable customers, products, services, and market opportunities, reduce costs, and increase customer satisfaction. The combination of increasingly powerful hardware/software platforms and business insight is what drives the innovation we see in BI&A.

Business intelligence and analytics have been described as both process and product: the results of processes and activities firms use to identify and extract useful information become the product which allows them to compete more successfully (Jourdan, Rainer, & Marshall, 2008; Vedder, Vanecek, Guynes, & Cappel, 1999). The process itself becomes the product as it generates insight. As processes mature and become proven industry benchmarks, they are embedded in new software products as standard capabilities. The process becomes replicable and, eventually, generic (Carr, 2003). BI&A technologies and processes only provide as much competitive advantage as the firm can extract from data.

Maintaining lasting competitive advantage becomes difficult because the creation of in-house applications specific to the firm’s needs is expensive and slow, and off-the-shelf products are one-size-fits-all. Firms whose BI processes or technologies are unique (likelier with an on-premises strategy) may find longer-lasting positions of competitive advantage (Barney, 1991). However, cloud platforms are ideal for fast innovation cycles because service elasticity dramatically lowers the risk of provisioning a service incorrectly (Armbrust et al., 2009). As such, a unified framework for identifying a single corporate strategy for provisioning internal BI services either in-house, on the cloud, or in a combination of both is unlikely to emerge soon.

While many academic and industry researchers do not bother to differentiate between business intelligence and analytics (Chen, Chiang, & Storey, 2012; Gartner, 2013), others describe analytics as extending beyond large-scale data storage, mining, and colorful interactive visualizations to develop new products and services based on deep analysis of all the data companies maintain (Davenport, 2006, 2013).

The notion of business intelligence has been around since at least the late 1950s (Luhn, 1958), but it was not popular with the business and IT communities until the late 1990s (Chen et al. 2012). At that point in time, business intelligence included all processes and technologies in support of better decision-making. In the late 2000s, the added “& analytics” descriptor began to make its way into research and industry literature (Davenport, 2006). In the 2010s, the two terms have started defining separate but related ideas, and while there is still no consensus, uniform definition of each, the differences are starting to become clear.

Chen et al. (2012), in their introduction to a special issue of the MIS Quarterly, the premier MIS academic research outlet (Lowry et al., 2013), synthesize a number of academic and industry articles and further classify BI&A into generations given the ever-popular 1.0, 2.0, and 3.0 labels. According to the authors, BI&A 1.0 includes most current technologies, procedures, and uses, which rely on the collection of structured data via transactional systems (often legacy) and its storage in traditional relational database environments. Data analysis is based on techniques developed in the 1970s and 1980s, consisting largely of intuitive, simple visualizations. BI&A 1.0 examples include practically any static data report or visualization but can extend to more current practices with dashboard applications. From spreadsheets to most OLAP activities, BI&A 1.0 captures what most companies are currently doing with their data.

BI&A 2.0 technologies and activities add text mining, web analytics, and social network analysis to the mix of data analysis activities discussed in BI&A 1.0. Perhaps the most recognizable label associated with BI&A 2.0 is big data. Big data has three defining characteristics: volume, velocity, and variety (McAfee & Brynjolfsson, 2012).

The volume of data being created by today’s business environment is truly spectacular. And it is being created at an astonishing rate (velocity). It is one thing to generate one petabyte of data (one million gigabytes, or 10^15 bytes), but it is quite another to generate it in a single day. It is estimated that in 2016 Netflix users watched around 8.3 petabytes of video per hour and that total consumption of data in the United States topped 2.5 petabytes per minute (James, 2016). The variety of big data is reflected in the multitude of sizes, formats, and contents of the data being created and includes both structured and unstructured data. Structured data is most easily described as the kind of predictable data associated with online purchases: customer name, shipping address, credit card information, etc. It is predictable because all data fields are easy to categorize, have a range of anticipated values and well-known formats, and are subject to maximum sizes which do not sacrifice content or meaning. Building a database to accommodate this data is a simpler task (as difficult as building a good database is).
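
As a quick back-of-the-envelope illustration of that scale, the sketch below converts the cited rates using the decimal definition of a petabyte given above; the daily totals are simple extrapolations of those rates, not additional data.

```python
# Quick unit arithmetic for the figures cited above (James, 2016),
# using the decimal definition in the text: 1 PB = 10**15 bytes.

PETABYTE = 10 ** 15          # bytes
GIGABYTE = 10 ** 9           # bytes

print(PETABYTE / GIGABYTE)   # 1,000,000 gigabytes per petabyte

netflix_pb_per_hour = 8.3    # cited Netflix viewing rate
us_pb_per_minute = 2.5       # cited total US consumption rate

# Implied daily volumes if those rates were sustained
print(netflix_pb_per_hour * 24)        # ~199 PB of video per day
print(us_pb_per_minute * 60 * 24)      # ~3600 PB (3.6 exabytes) per day
```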

Unstructured data is normally defined to include multimedia, graphics, e-mail, and a large number of social media content formats (Tallon, Short, & Harkins, 2013), all of which are much less suited to simple numerical analysis or categorization (Devlin, 2010). To illustrate that last point, the reader is invited to consider what the average of his/her last 20 social media posts would be (the question itself makes little sense and invites more questions in order to define it better). Building data repositories for unstructured data is a more complex task because it is much more difficult to predict its content and to categorize it. Extracting meaning from data with such varied origins, content, and format is a formidable challenge.

Lastly, BI&A 3.0 is the label applied to data products with analytics embedded, which provide and create data simultaneously. These technologies and activities support and enable location-aware, mass-customized, context-sensitive, mobile analysis and product/service delivery. Examples include location-relevant advertisements and discounts in mobile traffic apps such as Waze, which can alert you to the presence of establishments or offers in your vicinity (data delivery) and record and report whether you take advantage of those offers (data creation). Using Yelp, consumers can check into restaurants and receive special discounts on the spot (data delivery and creation) and be asked to review their experience later on (data creation).

Other offerings which consume and create data include in-network endorsements or recommendations in apps like LinkedIn, as well as the ability to modify Facebook ad campaigns in real time (as an advertiser) and to request that similar ads not be shown to you (as a consumer); these are all BI&A 3.0 services. Davenport (2013) calls them simply Analytics 3.0.

It is also worth noting that there is overlap and bleed at the edges of BI&A 1.0, 2.0, and 3.0 services, and a clear-cut border is often difficult to find. One is also likely to see a blurring of the lines between the terms business intelligence and analytics in current environments. In general, however, an on-prem approach is adequate for BI 1.0 activities, a hybrid approach can complement on-prem strategies to provide BI 2.0 services quite well, and the speed of application development in a cloud strategy is better suited for the intensive platform demands of BI 3.0 applications and products.

The Society for Information Management’s (SIM) long-running (since the 1980s) IT Issues and Trends study series shows BI has remained at or near the top of technology investment by surveyed organizations in the United States for over a decade (Luftman, Kempaiah, & Nash, 2005, many others) and in recent surveys across several other geographical regions as well (Luftman et al., 2013). Similarly, IT and business alignment has been on the list of top five management concerns for an even longer period of time (Kappelman et al., 2015; Luftman & McLean, 2004; many others).

In the past few years, the SIM survey has measured cloud computing and its many associated manifestations (XaaS, SOA, etc.) separately, and they have been shown to be among the areas receiving the highest levels of investment by organizations (Kappelman et al., 2015; Luftman & Derksen, 2012). Combined, all these results point toward the importance of aligning BI service capabilities with business objectives, especially when cloud computing activities are receiving such a high level of attention and investment.

BI Service Architecture

A discussion of a general architecture to support the delivery of internal BI services through various modalities follows. By necessity, BI architectures will vary based on business need, budget, and technology capability (Shariat & Hightower, 2007; Turban, Sharda, Aronson, & King, 2008; Watson, 2009; many others) and will contain differing combinations of data sources, storage, and reporting technologies (Ong, Siew, & Wong, 2011). Figure 10.1 illustrates a generic BI services architecture suitable for the discussion of provisioning of internal BI services in this chapter.

Starting from the left side of the diagram, Fig. 10.1 shows the two possible sources of data (internal and external) and some of the various combinations of systems and partners from which data can originate. Data may be generated internally by transactional systems which collect data as customers order products or services from companies (TPS), as products move through a supply chain (SCM, ERP), as customer data is collected by sales staff (CRM), as raw materials are received and processed, and as products are manufactured, shipped out, etc. (op sys).

Data may also be generated externally as a firm purchases marketing data sets, tracks product movement through the supply chain, advertises on social media outlets, monitors its online presence, engages with customers via social media, etc. As the data is generated internally or received from external sources, the firm may store it temporarily in an operational data store (not much more than a temporary database) until any preliminary data formatting, validation, and other activities are performed or until a suitable time after which the data may be moved elsewhere for preparation. This could be the end of a transaction, close of business day, the end of an online campaign, or once a predetermined amount of data has been collected.

The operational data store shown is a proxy for various possible collection points. Though Fig. 10.1 does not explicitly indicate it, this data may be analyzed for near real-time trends and insights, but companies must be careful about interpreting results at this stage of analysis because data may be incomplete, redundant, or inaccurate and produce flawed insights (van der Lans, 2009a, 2009b, 2009c).

In preparation for eventual storage in an enterprise data warehouse, data may be moved to a data staging location (to make space for new data coming into the operational data store). With the rise of big data, this data staging location may also be known as a data lake and be used to combine structured data typically stored in relational environments with unstructured data coming from external environments.

These environments will most frequently be implemented using well-known big data frameworks, technologies, and tools such as NoSQL databases, Hadoop, HDFS, and many others. The data in these staging areas or lakes can undergo analysis or additional and extensive filtering, validation, reformatting, and selection, collectively known as extract-transform-load (ETL; Watson, 2009).
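
The following is a minimal, hypothetical ETL sketch in Python: an in-memory list stands in for the staging area or data lake, an in-memory SQLite database stands in for the enterprise data warehouse, and all table and field names are invented for illustration.

```python
# A minimal ETL sketch (hypothetical names throughout): extract from a
# staging area, transform/validate, and load into a warehouse table.
# SQLite stands in for the enterprise data warehouse.

import sqlite3
from datetime import datetime

# Extract: records as they might arrive from an operational data store.
staged_rows = [
    {"order_id": "A-100", "amount": "49.90", "ordered_at": "2017-03-02"},
    {"order_id": "A-101", "amount": "oops",  "ordered_at": "2017-03-02"},  # bad row
    {"order_id": "A-102", "amount": "15.00", "ordered_at": "03/04/2017"},
]

def transform(row):
    """Validate and reformat one staged row; return None if it fails checks."""
    try:
        amount = float(row["amount"])
        raw_date = row["ordered_at"]
        fmt = "%Y-%m-%d" if "-" in raw_date else "%m/%d/%Y"
        ordered_at = datetime.strptime(raw_date, fmt).date().isoformat()
        return (row["order_id"], amount, ordered_at)
    except (ValueError, KeyError):
        return None  # a real pipeline would route this to a reject log

# Load: write only the clean rows into the warehouse table.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_orders (order_id TEXT, amount REAL, ordered_at TEXT)")
clean = [t for t in (transform(r) for r in staged_rows) if t is not None]
warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
warehouse.commit()

print(warehouse.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())
# -> (2, 64.9): the malformed row was rejected during the transform step
```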

An enterprise data warehouse (EDW) is a long-term storage and access environment for the data. Transactional systems generate and collect the data to be stored in the EDW and are dynamic and volatile, i.e., data changes very quickly. An EDW, by contrast, is static and batch-oriented, and ETL activities are critical to bridging the gap between the low complexity of the data processing in operational data stores and the very high complexity and accuracy required of long-term EDW storage to support BI services (DeLua, 2008).

EDW contents are often categorized into data marts, collections of data specifically suited for targeted use (Chen et al. 2012). For example, some of the data contents of an EDW may be flagged as being of particular interest to the marketing or accounting functions, and some of this data may be of interest to both. This can be done with or without the need for data replication, and any given piece of data may be part of multiple data marts.
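
One common way to carve data marts out of an EDW without replicating data is to define them as views over warehouse tables. The sketch below is a hypothetical illustration of that idea; the table, view, and account names are invented, and the same transactions feed both the marketing and accounting marts.

```python
# Hypothetical sketch: data marts as views over one warehouse table,
# so no data is replicated and a row may belong to several marts.

import sqlite3

edw = sqlite3.connect(":memory:")
edw.execute("""CREATE TABLE transactions
               (txn_id INTEGER, region TEXT, amount REAL, gl_account TEXT)""")
edw.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", [
    (1, "EMEA", 1200.0, "4000-SALES"),
    (2, "APAC",  830.0, "4000-SALES"),
    (3, "EMEA",  -75.0, "5100-REFUNDS"),
])

# Marketing mart: revenue by region (sales only).
edw.execute("""CREATE VIEW mart_marketing AS
               SELECT region, SUM(amount) AS revenue
               FROM transactions WHERE gl_account = '4000-SALES'
               GROUP BY region""")

# Accounting mart: every posting by GL account, including refunds.
edw.execute("""CREATE VIEW mart_accounting AS
               SELECT gl_account, SUM(amount) AS balance
               FROM transactions GROUP BY gl_account""")

print(edw.execute("SELECT * FROM mart_marketing").fetchall())
print(edw.execute("SELECT * FROM mart_accounting").fetchall())
```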

Online analytical processing (OLAP; Chen et al. 2012) tools and activities feed data directly to users or to additional BI tools such as dashboards, predictive modeling and data mining engines, etc. Big data mechanisms (such as MapReduce) can also provide users with data stored in distributed environments (like Hadoop) for direct analysis.
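
The sketch below illustrates the map/shuffle/reduce pattern in plain Python, without assuming any particular framework; in a real Hadoop or MapReduce deployment the partitions would live on different nodes and the shuffle would happen across the network. All records are invented.

```python
# Toy MapReduce-style aggregation over "partitions" of data that would,
# in a real deployment, sit on different nodes of a distributed store.

from collections import defaultdict

partitions = [
    [("store_01", 19.99), ("store_02", 5.50), ("store_01", 3.25)],
    [("store_02", 12.00), ("store_03", 7.75)],
]

def map_phase(partition):
    # Emit (key, value) pairs; here the records are already key/value shaped.
    for store_id, sale in partition:
        yield store_id, sale

def shuffle(mapped_streams):
    # Group all values by key, as the framework would between map and reduce.
    grouped = defaultdict(list)
    for stream in mapped_streams:
        for key, value in stream:
            grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Aggregate each key's values into a single total.
    return {key: round(sum(values), 2) for key, values in grouped.items()}

totals = reduce_phase(shuffle(map_phase(p) for p in partitions))
print(totals)  # {'store_01': 23.24, 'store_02': 17.5, 'store_03': 7.75}
```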

This simple architecture can be made more complex by the addition of technologies and processes designed to help the organization define data access and system configuration globally (master data management), specify, standardize, and codify the meaning of any data structure (metadata), facilitate the exchange of data between systems on demand (enterprise data and messaging buses), engage software services within and outside the company (web services, SOA), etc. The basic structure presented in Fig. 10.1 is sufficient to guide the discussion in this chapter.

BI Service Provisioning Strategies

This section discusses three strategies for the provisioning of internal BI services, beginning with the full ownership of all assets needed by the company itself, using its own technology infrastructure, followed by the opposite end of the spectrum, a fully cloud-based approach. A discussion of a hybrid strategy closes out the section.

On-Prem Strategy

The provisioning of a firm’s internal BI services using its own human and technology assets is commonly known as on-premises, for obvious reasons. The term is hereafter shortened to on-prem for convenience and to reflect common industry practice. A pure on-prem strategy requires that a company own all human and technology assets required to create, collect, store, and analyze data, on both the client and on the server side, including all network assets in physical locations where the firm’s employees are permanently located.

In anticipation of the discussion of a hybrid provisioning strategy, it is important to note that we still define remote access to on-prem BI assets as on-prem, because the BI assets themselves are the firm’s property. Among the benefits of a pure on-prem strategy are the development of expertise with the planning, implementation, operation, and replacement of traditional IT infrastructures, the ability to completely specify all security controls and environments for all BI services, and greater business agility/service speed. These benefits, and how they are accrued, are discussed below.

First, the ability to offer BI services on-prem allows an organization to develop in-house expertise with all aspects of information systems service delivery, from implementation to operation and eventual replacement, enhancing the firm’s ability to recognize the value of technical innovations, an idea labeled absorptive capacity (Cohen & Levinthal, 1990). Zahra and George (2002) extended the concept to differentiate between potential and realized capacity and defined them as the acquisition and assimilation of new information or innovations and the transformation and exploitation of either or both.

Additional research argues that competency with BI systems increases the firm’s ability to acquire and transform additional innovations and to assimilate BI technologies and services (Yeoh, Richards, & Wang, 2013). Empirical work by ElBashir, Collier, and Sutton (2011) shows that, while top management support is a strong moderator of the deployment of BI technologies and services, the power to assimilate and transform BI capabilities comes “from the bottom-up” (p. 180), that is, from the managers working directly with these services and who are closer to the operations of the firm. On-prem delivery of BI services can be a tremendous advantage to the corporation.

Second, security and privacy and their associated controls have been top-level strategic concerns for management for more than 10 years (Luftman & McLean, 2004; Kappelman et al., 2015; many others). These concerns will remain at the forefront of thought for strategic planners as IT service provisioning priorities shift from tactical and operational concerns to strategic and value-generating areas (Kappelman et al., 2015).

The ability to completely regulate the security controls environment for BI services is a very attractive benefit of on-prem provisioning. This is an increasingly difficult assignment for any organization, and total control over data security and privacy is very much desirable.

Lastly, the annual SIM industry survey previously cited has also identified business agility as a top management concern for well over a decade. Business agility has been defined variously as the ability to react to changing market conditions more effectively and efficiently and to identify prospects for innovation, maintain superior customer satisfaction, and gather expertise, technology, partnerships, and other assets to act on those opportunities with speed and surprise (D’Aveni, 1994; Goldman, Nagel, & Preiss, 1995). Wixom, Yen, and Relich (2013) expand that definition by including the speed at which an organization can turn raw data into useful information, an ability they term “speed to insight” (Wixom et al., 2013). A firm’s expertise with on-prem BI services can contribute to business agility by enabling truly customized service development with in-house expertise.

Cloud Strategy

The second BI strategy discussed here is the complete provisioning of all BI services via external, network-accessible vendors, commonly referred to as cloud providers. Fittingly, this approach is referred to here as, simply, cloud. Before a description of what cloud services are and could be, it is worth noting that cloud services are defined by both providers and customers as their needs require, and no single theoretical framework to explain their adoption, diffusion, or assimilation yet exists.

The reasons for the corporate adoption of any cloud service are as varied as the services themselves. Whether any particular company derives any benefit from any given cloud service is so heavily dependent on context as to make a universal framework of analysis a daunting task. This section provides a brief history of the development of cloud services and descriptions of their defining characteristics, service models, and customer deployment models and closes with a brief discussion of some of their benefits and limits.

The provisioning of computing services over network connections by third-party providers has a checkered past. Companies looking for cost savings began to outsource computing and software services in the early 1990s, and as network bandwidth increased and costs decreased, access to those services over network connections became a much more attractive option. The first generation of this type of service involved access to commercial off-the-shelf software on a subscription or on-demand basis, and vendors became known as application service providers (ASPs; Lee, Kim, & Kim, 2007).

The initial success of ASPs was propelled by a desire for firms to concentrate on their core competencies, reduced IT costs, and a shortage of skilled workers (Lee et al. 2007). The collapse of stock market prices for Internet pure plays in the late 1990s led to the failure or consolidation of most “dot-coms” and to the restructuring of Internet-based business models. After that period, the ASP acronym fell out of favor due to its association with the dot-com bust. The outsourcing of software services model, however, survived and reemerged with several variants and names, such as Software as a Service (SaaS) and the preferred label of cloud computing we use today, which has now grown to include SaaS. Cloud computing has been formally defined by the US Department of Commerce’s National Institute of Standards and Technology (NIST) as:

enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. (Mell & Grance, 2011, p. 2)

This shared pool of resources has five defining characteristics:

  1. On-demand self-service, such that consumers can engage services as needed, when needed, without human intervention
  2. Broad network access over standard telecommunications technologies for heterogeneous computing platforms
  3. Resource pooling, otherwise known as multi-tenancy, in which numerous service consumers share hardware, software, and other resources, dynamically assigned according to demand, and without the consumer having knowledge or control over the physical location of said resources
  4. Rapid elasticity, which gives the impression of unlimited resources to consumers who can scale demand up or down without prior negotiation
  5. Measured service, which helps optimize the allocation of resources by providers and charges consumers only for what they use (a small illustration of elasticity and metered pricing follows this list)
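
The back-of-the-envelope sketch below illustrates rapid elasticity and measured service together: a fixed fleet sized for peak demand is compared with metered capacity that follows hourly load. All demand figures and the per-server-hour rate are invented, and both options are priced at the same rate purely to isolate the utilization effect.

```python
# Hypothetical comparison of fixed (peak-sized) capacity vs. elastic,
# metered capacity for one day of fluctuating demand. All figures invented.

hourly_demand = [2, 2, 1, 1, 1, 2, 4, 8, 12, 14, 15, 16,
                 16, 15, 14, 12, 10, 9, 8, 6, 5, 4, 3, 2]   # servers needed

cost_per_server_hour = 0.40                       # assumed metered rate
fixed_fleet = max(hourly_demand)                  # fleet sized for peak demand
fixed_cost = fixed_fleet * 24 * cost_per_server_hour
elastic_cost = sum(hourly_demand) * cost_per_server_hour

print(f"Peak-sized fleet: {fixed_fleet} servers, daily cost {fixed_cost:.2f}")
print(f"Elastic/metered capacity: daily cost {elastic_cost:.2f}")
print(f"Utilization of the fixed fleet: {sum(hourly_demand) / (fixed_fleet * 24):.0%}")
```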

Cloud services are generally delivered in one of three primary service models: Software, Platform, and Infrastructure as a Service (SaaS, PaaS, and IaaS, respectively). The differences can be subtle, but broadly speaking, SaaS delivers complete applications, IaaS delivers raw computing infrastructure (servers, storage, and networking), and PaaS sits in between, delivering a managed hardware/software platform on which consumers run their own applications. Thus, consumers have a range of options: engage software services offered by the provider, use the provider’s hardware to run the consumer’s choice of software, or use the provider’s hardware/software platform as if owned by the consumer.

Deployment models range from completely private clouds to shared (community cloud), public, or combinations of the above (hybrid cloud). While a private cloud sounds suspiciously like on-prem service (and does, in fact, include it), the resources on the cloud may be owned and operated by a third party, on- or off-premises. Thus, a careful distinction must be made (however fuzzy it may become) between how the company chooses to provide internal BI services and how the services are offered. In this chapter, any portion of an internal BI service not owned and operated by the consuming organization is considered a hybrid strategy, which could be offered in a private, community, public, or hybrid cloud environment.

In a pure cloud strategy, companies perform all BI activities, from collection to storage to analysis, using computing resources and analytics software offered by network-based providers (Thompson, 2009). The growth of big data and the increasing affordability of processing and storage have been major drivers of cloud BI. As the volume of data generated by firms and the affordability of computing power both grow, more and more companies are finding it attractive to outsource BI services to external vendors.

Many large organizations not traditionally in either the technology or technology services sectors now offer access to their vast computing and storage resources to provide almost any level of service to numerous clients. Companies like Amazon, Google, IBM, and Microsoft have been consistently among the industry leaders in cloud services for several years (Darrow, 2015). These established firms can leverage existing technology investment and expertise in IT/IS service provisioning and generate more market opportunities for themselves. Newer players such as CenturyLink, 1&1, Rackspace, and a multitude of smaller national and regional cloud providers (Liu, 2016) compete vigorously in what is sometimes referred to as the service as a service (LeMerle, 2012) arena, which also uses the SaaS acronym. Others prefer the term everything as a service, abbreviated as XaaS.

Gartner’s Magic Quadrant reports describe a dizzying array of services from a number of players, including professional services companies like PricewaterhouseCoopers, KPMG, Ernst & Young, Deloitte, traditional software developers like SAP, and international players such as Tata, Wipro, and Tech Mahindra (Heizenberg, Lo, & Chandler, 2017).

Some of the benefits of a cloud strategy are greater business agility, scalability, platform flexibility, enhanced security, and cost advantages. Agility is a firm’s ability to respond effectively and quickly to unanticipated change (Goldman et al. 1995), and a cloud strategy enables companies to experiment with new BI services without undergoing infrastructure modifications and committing extensive resources (capital, technology, or human). In turn, this allows them to “fail fast” and learn about useful BI service features in shorter cycles. Scalability is the capacity to grow services quickly, on a large scale, and in consistent fashion, and then to draw them back down to meet lower demand without sacrificing performance or compromising capital investment.

Platform flexibility is the ability to quickly take advantage of multiple platforms offered by service providers, which allows companies to generate greater numbers of digital service options (Sambamurthy, Bharadwaj, & Grover, 2003) to develop new products and services and increase speed to market. Business continuity and security controls offered by service providers are critical, and companies may choose tested and certified offerings which match or exceed their own ability to provide enhanced security. Lastly, cost advantages include the on-demand nature of these services and of their fee structures, which remove capital investment constraints for firms and enable targeted investment in BI services with quick ROI.

Among the drawbacks of a cloud strategy, one may consider the loss of opportunity to develop in-house technical expertise with the technologies required to provide BI services, a trade-off between complete customization and limited configuration based on provider capabilities, dependence on external security controls environments, and the potential for more difficult egress from a service provider due to closed data architectures and formats.

Hybrid Strategy

In a hybrid strategy, companies provide some BI services using existing on-prem infrastructure and other BI services using cloud providers. There are almost limitless combinations of on-prem and cloud services to choose from, and they can start with very small expenditures. Firms may maintain complete control of data collection and only outsource analysis, or do any of the data collection, storage, access, or analysis on the cloud and complete the rest of the tasks on-prem. A company may even choose a different approach for its different business units, product lines, or physical locations.

Not surprisingly, a hybrid approach may include all benefits and drawbacks typically found in on-prem and cloud strategies, and not necessarily in best-of-both-worlds combinations. A company may find the agility provided via cloud services to be a tremendous opportunity but fail to develop a strategic base of expertise in-house by providing its BI services externally. At the same time, a company could be in complete control of its cloud-based data access and controls environment but choose a technology platform which makes it extremely difficult to switch providers later.

The next section illustrates successes in the delivery of internal BI services using on-prem, cloud, and hybrid approaches and issues limiting their growth.

Industry Opportunities and Issues

A review of current practices and industry reports shows a number of practices driving the success of internal BI service delivery by organizations of various sizes and scopes (Heizenberg et al. 2017). Many service providers deftly integrate business advisory expertise with deep and/or wide-ranging technical skills to successfully develop appropriate service platforms for their clients. IBM is noted for its wide BI product capabilities and its history of success with its clients.

Service providers which emphasize cloud-based over on-prem solutions have a wider array of options to offer clients, along with a range of attractive pricing models. Oracle is offered as an example of a company with wide-ranging on-prem and cloud solutions aimed at making the transition to cloud-based services easier. Amazon’s web services platform (AWS.com) is one of the dominant players in the cloud arena, partly because of its many product choices, and the flexible pay-as-you-go options for its offerings.

The largest professional services organizations, also known as the Big Four (Deloitte, Ernst & Young, KPMG, PwC), offer services across various industries, supporting their central staff with international resources from a strong network of member firms. The Big Four professional services firms have extensive global networks: Deloitte employs more than 240,000 people in 150+ countries, EY (ey.com) reports over 230,000 professionals in over 150 countries, KPMG (kpmg.com) has over 174,000 people in 155 countries, and PwC (pwc.com) employs more than 220,000 associates in 157 countries. Partnerships with local resources can lead to innovation and service upsell due to the providers’ ability to deliver locally on projects of greater complexity, involving multiple technical platforms, lines of business, and locations (Heizenberg et al. 2017).

Large organizations with many capable partners can deliver sophisticated service portfolios focused on a number of different industries and become trusted strategic partners for organizations. There are opportunities for smaller providers who can be nimble and respond faster to specific needs. Companies such as KPI Partners, CBIG Consulting, Protiviti, and many others do not have the number of employees or the global network of partners the Big Four have but offer well-regarded services.

A number of common threads regarding barriers to the success of internal BI service provision emerge. The 2017 Gartner Magic Quadrant report shows several of their top-ranked business intelligence and analytics vendors possess excellent business advisory depth or deep technical skills. The report also states deep expertise in both areas is much less common. Other challenges range from issues with system integration expertise to a lack of depth “on the bench,” leading to a scarcity of knowledgeable technical experts who can respond quickly enough to challenges. On the business advisory side, clients commonly list gaps in thought leadership, basic consulting skills, project and change management, and innovative solutions as barriers.

Service provider size can compound these problems. Too small, and providers are often unable to balance staff deployment with needs, provide quick turnaround, or satisfy demand for domain expertise. Gartner cites evidence that regional providers face problems with consistency of execution, methods, or best practices across geographies. For example, the 2017 Magic Quadrant report states regional European, Canadian, and Indian providers and system integrators frequently deal with inconsistent engagement scope and benchmarks, lower-than-expected business deliverables, better regional than global results, and cultural fit issues.

Large providers can become bogged down by bureaucratic procedures and coordination issues across locations or business units and still be spread too thin. Regardless of size, the greater the complexity of the projects providers take on, the higher the price tag for client organizations.

Vendors with deep technology expertise face issues beyond project or change management challenges. For providers whose deep technology expertise is limited to one or a few platforms, or whose BI services and solutions are based on their own proprietary platforms and applications, skill gaps with third-party tools and services present problems for customer organizations.

Customer organizations also need to be on the lookout for data egress strategies when establishing service contracts. Proprietary platforms, or even the way contracts are created and signed, can leave the organization locked into data formats or contract lengths which can jeopardize growth or service improvements. This lock-in condition (Shapiro & Varian, 1999) is problematic for the firm because data migration can be a significant barrier to later growth. Economic models show this kind of lock-in is not always predictable, particularly in markets subject to increasing returns, as technology markets are (Arthur, 1989), and that equilibrium in increasing returns markets can often be suboptimal.

The International Institute for Analytics (IIA), an independent research and advisory firm, found that the biggest barriers to the adoption and effectiveness of BI are difficulty turning insight into action, lack of qualified talent, quantifying value of BI efforts, organizational culture, and a lack of upper management support. The findings result from a 2016 survey of over 300 mid-market and large enterprises across a range of industries (IIA, 2016).

Software developers and service providers enjoy touting the ability to customize their products/services to a client’s needs. However, one quickly finds there is a large gap between customization and the ability to configure services using limited options. While configuration options in software applications and services can be quite extensive and may even appear endless, there is an end to a customer’s choices. Typically, customization of a software application goes beyond the ability to configure it using developer-provided options and requires software (re)engineering to develop extensions to off-the-shelf capabilities.

When BI services are developed or operated on-prem, the firm’s ability to completely dictate the capabilities offered by these services is greater. This ability is greatly reduced, at best, when software moves to the cloud, because control over software functionality moves to the vendor and multi-tenancy requires software capabilities to remain consistently available to all customers. Customization remains an option in SaaS environments, but multi-tenancy changes vendor responsibility to its customers (Song, Chauvel, Solberg, Foyn, & Yates, 2017). Configuration, as extensive and beneficial as it may be, is simply not a substitute for customization.

Future Industry Trends

Despite the high levels of interest, importance, and investment in BI in the past decade (Kappelman et al., 2015), service maturity levels remain relatively low. Findings by Forrester Research suggest more time is necessary for providers to develop and converge on industry best practices, incorporate lessons learned, develop technologies and processes suitable to business needs, and determine the right mix of centralized versus distributed offerings (Evelson, 2011).

The growing need for BI-specific technology, services, and data governance at enterprise scale versus the locally optimal solutions prevalent in the market at the moment will continue to drive service maturity. As service complexity needs grow, the initial rush to adoption of disparate tools without an overarching architecture will subside, and more mature architectures will emerge.

This has been a long-standing part of the cycle of maturity of corporate IT architecture, which Ross (2003) describes so well. She identifies four stages of corporate IT architecture, each with identifying capabilities and increasing levels of service maturity: the cycle starts with separate, local investment by individual business units without a guiding plan (the application silo stage) and proceeds to a controlled list of standardized technologies. Next comes a stage she calls rationalized data, which includes the standardization and documentation of not only data formats but also the business processes supported by IT, and finally a stage which enables much greater agility through the technology, data, and process modularity brought about by standardization.

Greater architecture maturity and capabilities will also reduce what Gartner calls technical debt (Sallam et al., 2017), known elsewhere as lock-in: hasty technical decisions regarding analytics platforms which show quick ROI, become entrenched, and leave the organization stranded or in a more difficult position to grow BI services in the future.

Growth for BI services in the areas of advisory, systems integration, platform selection or development, implementation, and operations will be driven by several trends. These include data governance, tool simplicity, automated knowledge discovery, faster and/or near real-time ETL activities, and mobility. Similar to the development of corporate IT architecture described by Ross (2003), BI services (on-prem and cloud) have begun a process of standardization of capabilities and tools and are evolving from best-in-class applications offering little compatibility with other tools. The eventual standardization of data meaning, aided by mature data governance practices and technologies such as the Extensible Markup Language (XML) and by industry data standards based on XML, will add pressure on BI service providers to compete on tool accessibility via simple interfaces.

The increasing power and affordability of IT will also enable a new generation of automatic pattern discovery by BI engines, allowing users to spend less time manipulating data and more time extracting actionable insight from it. These automatically discovered insights extend beyond simple descriptive statistics and into significant correlations, meaningful clusters, components of multivalued constructs, etc. Automated discovery can also help reduce analytical bias, which is present when existing knowledge influences the conclusions derived from a data set.

ETL activities will also be performed faster and more efficiently, and data access to mobile platforms will be more ubiquitous. Demand for simple, easy-to-use BI platforms which provide greater accessibility to sophisticated data visualizations and which increase business agility will continue to be high. Both of these trends will be enabled by the continued affordability and increased performance of IT components.

Data governance is a subset of IT governance, the management and control of IT planning and implementation to support business objectives (Van Grembergen, 2002). Data governance concerns itself with the technologies, policies, and practices which govern the management of data and other electronic assets (Soares, 2015). Data governance issues include consistent data meaning (semantics) throughout the organization, as well as the more traditional governance concepts of ownership, data operations permissions, etc. Data semantics are an important, but nearly invisible, enabler of BI services. Semantics refers to technologies and processes which enable consistent, well-defined data meaning, allowing systems and people to work with that data based on common understanding. Semantics express portable data and processing rules about the data so that differing terms used by separate systems to refer to the same concept can be reconciled automatically (Berners-Lee, Hendler, & Lassila, 2001). Data analysis, insight extraction, and automated pattern recognition will all improve as semantics policies and technologies mature across industries and within individual firms, fueling growth.
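
As a small, hypothetical illustration of that kind of reconciliation, the sketch below maps records from two source systems that use different field names and date formats for the same customer concept onto a shared vocabulary; the systems, field names, and formats are all invented.

```python
# Hypothetical semantic mapping: two source systems describe the same
# customer concept with different field names and date formats; a shared
# vocabulary lets them be reconciled automatically before analysis.

from datetime import datetime

CANONICAL_FIELDS = {
    # source system -> {source field: canonical field}
    "crm":     {"customerNumber": "customer_id", "signupDate": "since"},
    "billing": {"cust_id": "customer_id",        "cust_since": "since"},
}

DATE_FORMATS = {"crm": "%m/%d/%Y", "billing": "%Y-%m-%d"}

def to_canonical(system, record):
    """Translate one source record into the shared (canonical) vocabulary."""
    mapping = CANONICAL_FIELDS[system]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    out["since"] = datetime.strptime(out["since"], DATE_FORMATS[system]).date()
    return out

crm_record = {"customerNumber": "C-77", "signupDate": "04/15/2016"}
billing_record = {"cust_id": "C-77", "cust_since": "2016-04-15"}

print(to_canonical("crm", crm_record) == to_canonical("billing", billing_record))
# -> True: both systems now describe the same customer identically
```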

Demand will also rise rapidly for BI services and platforms which can handle the increased speed of multistructured data, i.e., structured data suitable for traditional relational applications alongside multimedia, multisource “big data.” Trusted data sets are important for all BI activities, and the speed, amount, and variety of big data (McAfee & Brynjolfsson, 2012) streaming into corporations are only accelerating. Lastly, mobility of BI services means more than simply their availability and performance on mobile devices. It will include the ability to provide formats and application programming interfaces (APIs) so that analysis and results can be inserted into a variety of applications, portals, feeds, and even data-based products (Davenport, 2013).

A couple of factors may also work to slow down full migration to cloud-based services. Data governance practices and/or proprietary data formats may prevent a company from completely moving to a cloud strategy. Data governance practices, formal or not, may make it difficult to overcome the inertia of keeping data “in house.” Gartner research (Sallam et al., 2017) shows only 51% of participants in a recent survey about cloud services intend to move to a cloud strategy, an uptick from 46% in 2016, but far from a categorical shift. One must be careful when evaluating such a result, as intent (as measured in the survey) obscures action, timeframes, the percentage of existing services being migrated, and even the extent to which a cloud service is truly a cloud effort and not a hybrid solution.

Secondly, work by Forrester (Evelson, 2011) defined untamed business processes as informal, localized, human-dependent, and cross-functional. Untamed business processes create difficulties for organizations looking to standardize BI services (as consumers or providers), because these untamed processes are often not highly visible, and they defy traditional formalization and codification processes and require custom approaches. The research report calls for an agile approach which combines technologies, tools, processes, methodologies, and organizational structures to increase the flexibility with which BI consumers adapt to the changes required by formalized and untamed processes. Not surprisingly, the features of agile BI options in the report include simplicity, automation, consistency, and mobility.

Academic research will continue to be hard-pressed to keep up with these developments, and its largest contribution to this rapidly changing field may be the reinforcement of solid concepts which can accommodate changing technologies while retaining their theoretical validity. Well-understood ideas in project management, customer requirements identification, and data governance, to name a few, combined with new technology developments and emerging data creation and consumption models are key to the formation of BI professionals.

One must remember the cyclical nature of developments in information technology and information systems and that, while BI services are certainly not a fad, the time will come when they are also not front-page news, and new concepts, technologies, tools, and methods will dominate the news cycle. Academic research contributions will continue to add value when theoretical concepts are strong and flexible enough to accommodate these changes.

Bridging the Research-Practice Gap

Researchers in the IT/IS field have made multiple calls for greater technology engagement in academic research (Orlikowski & Iacono, 2001) and to expand how theory is generated to include multiple research perspectives (Orlikowski & Baroudi, 1991), not just the dominant view of most mainstream academic research (Lee, 2010). Other IS academic researchers have observed that academic IS research is in danger of talking about itself, mostly to itself (Keen, 1991), and that three primary effects are to blame: the separation of academia from industry practice, the language and tone of academic research, and the choice of publication channels for the research (Lang, 2003).

The article closes with suggestions for fostering two-way dialogue between the academic and industry sectors to develop relevant research which can influence pedagogy and provide necessary skills not widely found in current graduates. Partnerships between academe and industry in the IT/IS field are strong but are greatly concentrated in a few programs and researchers. Dialogue can help identify and develop curricular programming or technology infrastructure at universities to help business students develop the skills to define and answer current problems. The suggestions are grouped in three categories: recruiting, pedagogy, and research.

The primary findings of a 2009 survey on the state of university-level BI curricula by the Association for Information Systems were that BI programs were not yet widespread, needed to provide a broader range of skills using interdisciplinary approaches, and required better alignment with industry needs (Wixom et al., 2011). College instructors reported a need for better access to data sets, case studies, textbooks, software, and technical support and training to help them deliver appropriate content.

The survey was repeated in 2012 and found demand for skilled BI professionals still outpaces supply despite a tenfold increase in the number of BI degree programs worldwide (Wixom et al., 2014). In the era of big data, survey results show foundational skills like communication, business expertise, basic analytics, data management, and SQL remain the most sought after. Employers participating in the survey express dissatisfaction with the practical experience of graduates. The survey also shows that employers most value internships and technical skills like dashboard development and analytics practicums (Wixom et al., 2014).

Firms can help universities in each of these areas by establishing internship or co-op programs, hosting or sponsoring case study or data analysis competitions, providing financial support for curriculum development, making in-kind donations of hardware/software, or offering free access to white papers, reports, or data sets for use in classrooms.

Employers report specific tool certification is not a critical element of the skills mix they seek (Wixom et al., 2014), but offering funds or training toward certification for instructors in desirable tools and methods would be useful in curriculum development. Guest lecturers and hosting faculty and students for in-office presentations, networking opportunities, and best practices demonstrations can also spur curricular innovation. Lastly, firms can assist in curriculum improvement efforts by participating in departmental advisory boards.

Academic corporate partners can increase visibility of their recruiting efforts in several ways. Many faculty members appreciate having access to experienced staff who can deliver guest lectures on special topics or skills or who can lead software or analytics demonstrations. This raises the visibility of the company on campuses and generates excitement among students, who see companies whose employees deliver guest lectures as potential sources of employment.

Visits to the company’s local offices where students learn what the company does and which skills are valued in potential employees are also very useful. The obvious approach of formal internship or co-op partnerships is a tried-and-true method, but too often companies overlook the value and impact of sponsoring student societies and honors societies. Very small investments can lead to tremendous visibility with top-performing student leaders: a few hundred dollars to cover expenses for induction ceremonies, informational meetings, or community service events go a long way on campus. Scholarships and book stipends for students in leadership positions can also greatly enhance the company’s prestige in the eyes of future professionals.

Lastly, calls for increased relevance in IS research have been made for decades (Robey & Markus, 1998). Lee (2010) argues that the starting point for IS research does not need to be theory but rather the observation, explanation, and documentation of the art & craft of IS professionals. From this, a theory can be derived, but more importantly IS research can document and explain practices as “consumable” for practitioners (Robey & Markus, 1998). Research partnerships between industry and academe already exist but are not as widely accessible as they could be.

Funding for work of specific interest to organizations, access to employees or consortium members for the purpose of collecting data for the development of theoretical models, and access to existing corporate data sets would be extremely useful to both academics and corporate partners. Participation in research design and execution, data analysis and synthesis, and publication of results are all valuable commodities to academics and of great use to industry partners. More creative outlets for research cooperation are faculty-in-residence opportunities, in which faculty members can work part time or take a sabbatical to work full time at an organization to develop and deliver research and other products of interest to the organization.

Business intelligence has demonstrated its value to organizations, but as with other IT investments, extracting this value is neither easy nor automatic. Collaboration between academia and industry can result in improved skills for the graduates who are in such high demand, a more prepared professional workforce, and research products which are of more immediate practical use to organizations.
