Chris Lee, Chief Commercial Officer of Logistical Labs, recently wrote an article for American Shipper about effectively aggregating, analyzing, and monetizing big data. Read the article below.
“My dear, here we must run as fast as we can, just to stay in place. And if you wish to go anywhere you must run twice as fast as that.” ― Lewis Carroll, Through the Looking-Glass
By 2020, there will be a staggering 44 zettabytes, or 44 trillion gigabytes, of data in the digital universe.
Many business leaders understand that investing in big data can lead to efficiency gains, a better customer experience, and ultimately higher sales. Companies are spending millions of dollars on tools meant to collect and analyze information, yet few have seen a tangible return on investment. Why? Because investing in analytics can be useless, or even harmful, if employees do not have a simple way to incorporate data into their day-to-day decision making. Companies need a clear business intelligence strategy, or else they run the risk of getting lost in a never-ending data rabbit hole.
Chasing the Elusive White Rabbit
3PLs and brokers today are stuck in a cycle of chasing loads, agreeing to fixed-price contracts with shippers, and getting burned when capacity tightens and rates rise. Margin compression is a daily concern, and the key to a failsafe pricing strategy remains just out of reach.
In addition to pricing challenges, the traditional sales process is time-consuming and involves several steps: prospect a customer, analyze their needs, research cost, research sell rate, respond to the customer, source the load, etc. Efficiency and revenue per employee are stagnant, and turnover is rampant. What if brokers could buy and sell in a more efficient and accurate manner? What if sales reps could talk to more customers per day, with fewer phone calls, mouse clicks, and keystrokes? Even shaving off seconds per rep, per day could have a big impact over time.
Many see data and technology as a potential solution, but few know how to effectively aggregate, analyze, and monetize it.
There is No Magic Pill or Potion
Many business intelligence tools and platforms promise big improvements to your bottom line, but it is important to remember that a product is useless without a clearly defined challenge to overcome. Too often business executives end up with an expensive IT solution that is waiting for a problem.
Companies must start with the problem and work toward a solution, not the other way around. Once a problem is identified, what is the end goal? Decrease the time it takes to book a shipment? Increase revenue? Acquire more customers? Not taking the time to strategically establish desired outcomes before purchasing technology leads to wasted time and resources and subpar results.
Further, too much data is not big data. Garbage in equals garbage out, and companies that start pumping inaccurate or meaningless information through their system may find that their sales force avoids using it altogether. In fact, one in three business leaders does not trust their own data. Dirty data leads to algorithm aversion within an organization—or worse, employees making poor decisions based on data that was expensive to collect.
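The "garbage in, garbage out" point can be made concrete with a minimal sketch of a data-quality screen. This is an illustrative example only: the field names (`origin`, `dest`, `rate_per_mile`) and the sanity band are assumptions, not part of any real platform.

```python
# Hypothetical sketch: screening freight-rate records before they reach
# an analytics pipeline. Field names and thresholds are illustrative.

def is_clean(record):
    """Reject records with missing lanes or implausible per-mile rates."""
    if not record.get("origin") or not record.get("dest"):
        return False
    rate = record.get("rate_per_mile")
    # Per-mile rates outside a sanity band are likely data-entry errors.
    return rate is not None and 0.50 <= rate <= 15.00

records = [
    {"origin": "CHI", "dest": "ATL", "rate_per_mile": 2.10},
    {"origin": "CHI", "dest": "",    "rate_per_mile": 2.35},   # missing lane
    {"origin": "DAL", "dest": "LAX", "rate_per_mile": 250.0},  # keystroke error
]

clean = [r for r in records if is_clean(r)]
print(len(clean))  # only the first record survives
```

Even a simple gate like this keeps obviously bad records out of the averages a sales team sees, which is often enough to prevent the distrust described above.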
In some cases, reliable data exists, but it is nearly impossible to locate. Like a library with no card catalog, a system with no coherent or accessible structure will slow employees down rather than help them. Remember, sales teams are experts in sales—not statistics. The key to making data analysis work is to simplify it. Hiding complex algorithms behind a user-friendly dashboard that guides users toward decision-making can make all the difference.
Surviving (and Thriving) in Wonderland
Imagine a world where sales reps effortlessly benchmark prices against both historical data and aggregated market indexes with the click of a mouse. What if they could simultaneously run pricing comparisons across modes and see how real-time market conditions affect those benchmarks? With the right analytics strategy and a dynamic technology solution in place, that world exists. Forward-thinking 3PLs and shippers are using big data to inform pricing, optimize mode selection, calculate transit times, measure wins and losses over time, analyze what drives revenue up or down, and more. The greater the variety and velocity of the data they collect, the better. More data captured means more accuracy, speed, and autonomy, until companies can manage by exception, spending less time on administrative tasks and more on strategy and growth.
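The benchmarking idea above can be sketched in a few lines: compare a proposed sell rate against a historical lane average and an aggregated market index. Everything here is a hedged illustration: the lane, the figures, and the 50/50 blend weighting are assumptions, not a real pricing model.

```python
# Hypothetical sketch of lane-level rate benchmarking. All numbers and
# the blend weighting are illustrative assumptions.

def benchmark(quote, history, index_rate, weight=0.5):
    """Blend historical and market benchmarks, then report the quote's spread."""
    hist_avg = sum(history) / len(history)
    blended = weight * hist_avg + (1 - weight) * index_rate
    return {
        "historical_avg": round(hist_avg, 2),
        "blended_benchmark": round(blended, 2),
        "spread_pct": round((quote - blended) / blended * 100, 1),
    }

# Chicago-to-Atlanta dry van, dollars per mile (made-up figures).
result = benchmark(quote=2.40, history=[2.05, 2.15, 2.30], index_rate=2.25)
print(result)
```

A rep reading the output sees at a glance that the quote sits above the blended benchmark, which is the kind of one-click context the article describes.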
The concept of the “digital broker” has emerged in recent years, with many technology-focused transportation startups being hailed as “Ubers for Freight.” These new companies have flashy websites and slick mobile apps, but they fall short when it comes to critical industry knowledge and partnerships. Because of this, the more likely disruptors will be those players already in the industry. Logistics companies that embrace big data should not be afraid of these newer startups, because the same technology they are promoting is likely already available to them.
In today’s freight market, the need for a dynamic, multi-modal pricing platform is greater than ever. Recent months have proven that pricing does not always follow the same ebbs and flows year over year. Hurricane season, economic and political shifts, and the impending ELD Mandate are just a few examples of factors impacting the current capacity market. Logistics providers need a way to establish custom rule sets that calculate the optimal price. Do you have an analytics platform in place that can make necessary adjustments when market changes arise?
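What a "custom rule set" might look like can be sketched as a list of conditions, each nudging a base rate when a market signal fires. The rule names, signals, and multipliers below are illustrative assumptions, not a production pricing model.

```python
# Hypothetical sketch of a rule-based price adjustment. Signals and
# multipliers are illustrative assumptions only.

RULES = [
    # (description, predicate on market signals, multiplier)
    ("tight capacity",   lambda m: m["load_to_truck_ratio"] > 5.0, 1.10),
    ("weather event",    lambda m: m["weather_disruption"],        1.05),
    ("soft market",      lambda m: m["load_to_truck_ratio"] < 2.0, 0.95),
]

def priced(base_rate, market):
    """Apply every matching rule's multiplier to the base rate."""
    rate = base_rate
    for name, applies, multiplier in RULES:
        if applies(market):
            rate *= multiplier
    return round(rate, 2)

market = {"load_to_truck_ratio": 6.2, "weather_disruption": True}
print(priced(1000.00, market))  # 1000 * 1.10 * 1.05 = 1155.0
```

Because the rules live in data rather than code, a provider can add or retune them as conditions like hurricane season or the ELD Mandate change the capacity picture.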
Big data can either act as a secret weapon or leave you madder than a hatter. Without a way to quickly access, parse, and learn from data, supply chain businesses may have a hard time competing. Those that put analytics at the forefront of their operations will not only survive Big Data Wonderland, they will hold all the cards.