
Marketing Automation and Predictive Analytics: Evaluating, Integrating, and Succeeding

By Steve Susina | December 14, 2020 | Marketing Strategy

Ten years ago, for an engineer by education, taking a leadership role in a marketing organization was half exciting and half crazy. At the time, marketing organizations had begun their steady evolution away from being the department of party planners and arts & crafts, but still maintained a reputation for being slightly (or significantly!) less technical.

I knew, however, that the skills an engineer brings to the table (process analysis, problem solving, data orientation, and the like) would be useful in a profession that was becoming more data-driven by the day.

The data-driven future of marketing has been amply demonstrated by the emergence of foundational marketing-stack technologies like CRM and Marketing Automation. And an engineering skillset comes into play whenever one needs to evaluate and deploy adjacent technologies, such as Predictive Analytics.

Evaluation: Is Predictive Marketing Analytics Right for Us?

Predictive analytics, when applied to marketing, promised to mine our track record of successful sales and identify lookalike prospects that were most likely to also become customers.

I attended a large user conference put on by one of the major marketing automation vendors and sat in on sessions to learn about the emerging predictive analytics technology. Engineers are ever curious about new technology.

While the presentation was impressive, I couldn’t help but think to myself that this was overkill for our business. We prided ourselves on customer intimacy. We built out our lead-scoring system in our MAP. We had a well-formed Ideal Customer Profile, based on the input of our leadership and sales team, all seasoned experts who knew the market and who would be a good client.

I left the conference without the shiny new technology in my marketing stack.

Yet, sitting through our pipeline reviews over the next several months, a nagging thought kept running through my mind: if our ICP is so well defined, why do our sales reps continuously spend months prospecting accounts that never close? Isn't this exactly the type of problem predictive analytics was designed to solve?

I signed on for a trial with one of the emerging providers, Everstring, which promised to build a predictive model based on two years of our closed-won account data coupled with their data incorporating some 10,000 demographic, firmographic and intent data elements.

Model built, it was time to create a framework for evaluation. Our company wasn’t one to invest and wait a year to see if the new technology provided results—our leadership wanted some idea of what the results might be…before plunking down cold, hard cash.

We already knew the sales results of every opportunity from the previous year. I looked at prospecting data (getting meetings from assigned leads) and sales data (closing them). Our baseline included 1,935 sales prospects, which resulted in 116 scheduled meetings and created 32 opportunities. To evaluate how predictive analytics would have guided our sales behavior, we took the model created using our closed-won data and applied it to our body of work.
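
For reference, the baseline funnel above implies the following conversion rates. A quick sketch; the percentages are derived from the article's counts, not stated in it:

```python
# Baseline funnel from the evaluation: 1,935 prospects -> 116 meetings -> 32 opportunities.
prospects, meetings, opportunities = 1935, 116, 32

meeting_rate = meetings / prospects       # share of prospects that got a meeting
opp_rate = opportunities / meetings       # share of meetings that became opportunities
overall_rate = opportunities / prospects  # end-to-end prospect-to-opportunity rate

print(f"meeting rate: {meeting_rate:.1%}")            # ~6.0%
print(f"opportunities per meeting: {opp_rate:.1%}")   # ~27.6%
print(f"overall conversion: {overall_rate:.1%}")      # ~1.7%
```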

The model, when applied to all our data, produced a 0-to-100 score for each lead, and we mapped that score onto a simple A/B/C/D grading scale.
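
That mapping is a simple bucketing step. A minimal sketch; the article does not state the actual cutoffs, so the thresholds below are illustrative quartile boundaries:

```python
def grade(score: float) -> str:
    """Map a 0-100 predictive model score to an A/B/C/D grade.

    The cutoffs here are assumed quartiles, not the ones the
    team actually used (those aren't given in the article).
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 75:
        return "A"
    if score >= 50:
        return "B"
    if score >= 25:
        return "C"
    return "D"
```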

But would the model show a real difference between leads? Or would lead performance vary no more than random chance would predict?

Here’s where things got interesting.

Although A-rated leads accounted for only 15% of our prospects, they resulted in 35% of conversions.

A full 85% of our opportunities were created from A- and B-rated leads.

Yet when we looked at prospecting, our SDR team spent 55% of their time on C- and D-rated prospects—and they actually spent more time with D-rated leads than A-rated leads!

Conclusion: Our sales team was spending a whole lot of time chasing business that our predictive model said we wouldn't win. If we could get our team to invest that same time in higher-rated leads, our sales would increase significantly.

Moving Forward – Integrating Technology with Process and Teams

First, there is not a lot to say about the technical integration of a predictive analytics tool with the CRM and Marketing Automation platforms. Modern APIs, customer success programs, and plug-in apps make the formal integration of tools a relatively simple and straightforward process.

The challenges of integration really fall into two categories: Process. And People.

We started with the people side and a difficult conversation: telling a team of seasoned sales professionals that their experience-driven, but largely gut-feel, ideal client profile didn't focus efforts on the prospects most likely to turn into customers. No matter how much data you bring, there is some defensiveness about introducing change to a process that has seemingly worked well in the past.

The solution is to create a well-defined process that takes into account the concerns of all parties and helps guide them to more successful outcomes overall.

First, we had to make the predictive model-based lead grade information transparent and available—every CRM object displayed the model score for every lead, contact, and account. We built automation rules to run every new lead through the model upon creation—and used that information to guide the next steps.

Second, marketing and sales worked together to develop a set of guidelines that leveraged our predictive model, but still permitted some activity with seemingly lower rated leads if they showed interest.

  • Inbound: Follow up with everyone—even Cs and Ds.
  • Marketing Qualified: Only As and Bs could be assigned to sales. Cs and Ds could still be included in nurture campaigns, but would only be assigned to sales if they explicitly requested a meeting.
  • Sales Cold Prospecting: Start with A leads. Then move to B. Period.
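
The three guidelines above can be sketched as a small routing function. The field names, channel labels, and return values are illustrative, not from the article:

```python
from dataclasses import dataclass

@dataclass
class Lead:
    grade: str                      # "A" through "D" from the predictive model
    channel: str                    # "inbound", "mql", or "cold" (assumed labels)
    requested_meeting: bool = False

def route(lead: Lead) -> str:
    """Return the next step for a lead under the team's routing guidelines."""
    if lead.channel == "inbound":
        return "follow_up"          # inbound: follow up with everyone, even Cs and Ds
    if lead.channel == "mql":
        if lead.grade in ("A", "B") or lead.requested_meeting:
            return "assign_to_sales"
        return "nurture"            # Cs and Ds stay in nurture campaigns
    if lead.channel == "cold":
        # cold prospecting starts with As and Bs only
        return "prospect" if lead.grade in ("A", "B") else "skip"
    raise ValueError(f"unknown channel: {lead.channel}")
```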

Another key to adoption was that our sales team saw that we used the model in our own marketing programs to guide our qualification efforts. No more dumping a list of random webinar attendees or white paper downloaders into a follow-up cadence. We created an SLA for ourselves and applied the same model at the marketing-to-sales handoff.

Creating A Mutually Successful Process

At a high level, the goal of our predictive marketing analytics investigation was to help our sales team be more successful.

And the results in the first full year following adoption showed we succeeded: we saw a 38% increase in pipeline and a 25% increase in revenue…with no increase in the size of the sales team.

By giving the sales team the insight to avoid wasting time on prospects unlikely to close, it’s like we’re creating more hours in the day for productive activity.


Steve Susina

An engineer by education and a marketer by choice, Steve currently leads the marketing team at Lorman Education, where he is responsible for demand generation, content marketing, and managing the company brand. His previous roles spanned IT/SaaS, publishing, eCommerce, and telecom. He loves the data-driven aspect of modern marketing and is a four-time Marketo Champion. Steve is a Chicago native who still resides in the area, holds a BS in Electrical Engineering from Marquette University, and an MBA from the University of Pittsburgh.