Outsourcing and the Promise of Big Data

Arvid Tchivzhel, Director & Chuck Currin, Principal Data Architect, Mather Economics

With 2016 in full swing, Big Data and every label it connotes (both good and bad) is here to stay. A quick check on Google Trends reveals interest in the subject remains near an all-time high, having peaked in March 2015 after two years of rocket-like growth. The search term “Hadoop” began gaining popularity nearly three years before Big Data became the buzzword it is today. If more evidence is needed that Big Data is here to stay, note that “legacy” vendors such as Microsoft, Oracle, IBM and Teradata, to name a few, have either added new features or acquired firms with roots in Big Data to ensure their market share is not upended by open-source consultants.

“The most important first step of any Big Data project is to know how the project will raise net profit”

Unless you were among the first to adopt NoSQL storage and processing shortly after the Apache Hadoop project officially launched in 2006, you likely started feeling the pressure to invest in Big Data after 2012. Many companies proceeded with the investment but suffered due to poor planning, poor execution or unclear goals for how the investment was supposed to improve profitability. Often there was a shortage of skilled people who could use the technology in a way that affected the bottom line.

Outsourcing YOUR Profit

Outsourcing has been a common practice since at least the 1980s, although examples of varying degrees surely existed long before then. There is nothing inherently wrong with outsourcing, and it can actually save significant costs when done correctly for specific needs. Cloud computing has been one of the most successful initiatives of the last few decades (both in practice and in marketing) and has driven down the costs of building and maintaining all aspects of business infrastructure. In the Big Data world, rapidly changing technology and a nearly endless stream of acronyms and buzzwords, including “Data Lake”, “the three V’s” and “HDFS”, among many others, led many CTOs to look to outside specialists and developer shops to help identify and implement Big Data projects. Additionally, the high cost of hiring a full-time employee or a group of highly skilled developers versed in the latest Big Data technology seemingly makes outsourcing a reasonable solution.

However, while outsourcing IT and infrastructure may be viable (look at Netflix as the example here), outsourcing talent, analytics and vision is a recipe for wasted dollars and a failed Big Data project. A study published by Wikibon in August 2013 (Floyer, 2015) determined that there was only a $0.55 return on each $1 spent on Big Data projects. Further study revealed two major reasons for the negative return:

1. A lack of relevant business use cases

2. A shortage of skilled talent (Kelly, 2013)
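
To put the Wikibon figure in perspective, a $0.55 return on each $1.00 spent works out to a return on investment of negative 45 percent. Below is a minimal sketch of that arithmetic in Python; the function is illustrative and not part of the study itself:

```python
def roi(total_return: float, total_spend: float) -> float:
    """Return on investment, expressed as a fraction of spend."""
    return (total_return - total_spend) / total_spend

# Wikibon's finding: $0.55 returned for every $1.00 spent on Big Data.
print(f"ROI: {roi(0.55, 1.00):+.0%}")  # ROI: -45%
```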

Yes, Netflix outsources its IT infrastructure, but most of its brainpower and analytics are kept in-house. Problems with Big Data projects are not caused solely by outsourcing but by the nature of how outsourced Big Data projects are managed and how some developer shops sell their services.

The Developer Shops

Imagine the following scenario: your CEO wants to improve ROI using the Big Data tools they have heard about from other CEOs and case studies. Your assignment is to build a Big Data framework within three months and begin impacting the bottom line within six months. The first step you take is to research the skillset of successful Big Data operations, followed by researching the vast range of competing tools and technologies, all claiming to be the one tool that will solve every problem imaginable. After some time, you finally find some developer shops that have everything packaged together: the talent, project management, technology, prior experience, etc. You evaluate and narrow the field to a handful of shops, each of which sends you a statement of work after scoping out the project. You pick one and get started, after which there is a flurry of activity, project updates, check-ins… this is easy.

Now come the problems: your Big Data infrastructure has been built by infrastructure specialists, and your data transformation process has been built exactly as scoped on a conference call three months ago. With the set-up phase wrapping up, the ongoing support costs now seem steep, and the execution does not meet expectations. You call another developer shop to ask its opinion and learn about all the wrong things done by the shop you chose. No worries, everything can be fixed for another round of investment.

Something clearly went wrong, but you are not sure where. In most cases, the problem was trusting a group of talented developers, programmers and specialists to understand your business and industry well enough to guide the execution of the project. The difference between outsourcing IT to the cloud and outsourcing a Big Data project is that Big Data cannot be generalized. Big Data is not the Hadoop cluster or the ETL process; it is the insights that yield incremental profit.

The Big Steps to Big Success with Big Data

1. Identify the business case(s) and revenue impact

2. Hire a business analyst who is a strong problem solver and critical thinker

3. Make every decision by asking how #2 will achieve #1

The most important first step of any Big Data project is to know how the project will raise net profit. In the simplest terms, achieving this goal comes from new revenue or lower costs. So, if you are ever unclear on this first step, put the tactics on hold and go back to strategic planning. Identify two to three items that are clear and quick wins, then identify two to three items that are more difficult, and finally find a “dream” item; each should contribute to the bottom line. Once you have the vision and strategy in place, hire a business analyst who can serve as the point person and project manager to ensure the technology execution meets the business needs. Lastly, once you have the use cases and an internal steward in place, choose a developer shop that has experience and that can customize to the needs of your business.
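As an illustration of that planning step, the sketch below ranks a short list of candidate use cases by estimated net-profit impact (new revenue plus cost savings, less the cost to build and run). Every name and figure here is an invented assumption for illustration, not real data or a benchmark:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    new_revenue: float   # estimated annual revenue lift, in dollars
    cost_savings: float  # estimated annual cost reduction, in dollars
    annual_cost: float   # estimated annual cost to build and run
    tier: str            # "quick win", "difficult" or "dream"

    @property
    def net_profit_impact(self) -> float:
        return self.new_revenue + self.cost_savings - self.annual_cost

# Hypothetical candidates; the figures are placeholders.
candidates = [
    UseCase("Churn-risk alerts for subscribers", 250_000, 0, 80_000, "quick win"),
    UseCase("Consolidated reporting pipeline", 0, 120_000, 60_000, "quick win"),
    UseCase("Dynamic pricing for new starts", 400_000, 0, 150_000, "difficult"),
    UseCase("Real-time personalized offers", 900_000, 0, 500_000, "dream"),
]

# Anything that cannot show a positive impact goes back to strategic planning.
for uc in sorted(candidates, key=lambda u: u.net_profit_impact, reverse=True):
    print(f"{uc.name} ({uc.tier}): ${uc.net_profit_impact:+,.0f}/yr")
```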

The “Chief Revenue Officer” (CRO) is a job title that originated in Silicon Valley and, according to Google Trends, seemed to appear overnight in 2011.

A key factor in success is ensuring the CRO is either leading or heavily involved in the definition and implementation of use cases for a Big Data project. Focusing on ROI and real dollars is the only way to prevent an underwhelming outcome after millions of dollars and thousands of hours have been spent chasing the powerful insights derived from Big Data. Strong talent and people should not be outsourced.
