From cloud migration to real-time analytics

Clinton McFall, Davey Resource Group’s Director of Technical Services, talks cloud migrations, data leadership and pushing real-time insights to the limit with Fivetran HVR.

Cloud migration

More about the episode

Davey Resource Group (DRG) is a subsidiary of Davey Tree Expert Company, the eighth-largest employee-owned company in the country. DRG does everything from auditing utility poles to wetland restoration to real-time telematics data collection. 

Clinton McFall, Director of Technical Services at DRG, manages a team of geographic information system (GIS) experts, solutions architects and data scientists. He’s deeply experienced in enterprise-scale projects, including work with SAP ECC, S4 HANA, proprietary software and cloud migrations.

McFall and his team tackle complex data challenges, including client data isolation across large data lakes, managing asynchronous data collection from field technicians and safely leveraging GenAI to explore GIS data in new ways. 

Learn how DRG’s partnership with Fivetran has helped them put real-time data to work in real-world use cases, from job site safety to conservation.

“Being able to generate those reports on a daily basis is powerful, but being able to have it in real-time, that's a different ballgame.”
— Clinton McFall, Director of Technical Services at DRG

Highlights from the conversation include:

  • Optimizing cloud elasticity for short- and long-term business gains
  • How and when to eliminate legacy systems
  • Building a real-time data collection and analytics program using public, private and client data

Watch the episode


Kelly Kohlleffel (00:05)

Hi folks, I'm Kelly Kohlleffel, your host. Every other week we'll bring you insightful interviews with some of the brightest minds across the data community. We'll cover topics such as AI and ML, GenAI, enterprise data and analytics, data culture and a lot more. 

Today, I'm really pleased to be joined by Clinton McFall. He is the Director of Technical Services at Davey Resource Group. Clinton has more than a decade of experience in geographic information systems. His expertise includes systems integration and crafting innovative enterprise solutions, and he's always making a continuous effort to maximize workflow efficiencies across Davey. 

During his time at Davey Resource Group, Clinton has led the deployment of several enterprise-scale projects, including the deployment of time systems integrated with SAP ECC and S/4HANA, the creation of proprietary production management software and the migration of an on-prem infrastructure to the cloud.

Clinton began his career in the public sector, working at a local government organization while pursuing his bachelor's from Kennesaw State University. Clinton, it's great to have you on the show today. Welcome in.

Clinton McFall (01:12)

Thank you very much. I'm excited to be here.

Kelly Kohlleffel (01:14)

Absolutely. Well, I gave a few highlights, but why don't we dive into your experience? Some of your journey from on-prem to the cloud. We want to hit on that, leadership lessons and a whole range of things. 

Tell me a little bit more about yourself and the Davey Resource Group. There are some really interesting things going on at Davey, but maybe not everybody knows that name. If you could, hit that first.

Clinton McFall (01:42)

Yeah, absolutely. Davey is an amazing company. When I started here, the things I learned and continue to learn about what we do are still surprising to me. To give you a tour of the company so everyone understands the structure, Davey Tree Expert Company is the parent company. Davey Resource Group, where I work, is a wholly-owned subsidiary of Davey Tree. It is Ohio's largest employee-owned company. I believe we're ninth largest in the nation now. It's been kind of exciting to see us go up the top 100 list since I've been here. 

There are four main markets within DRG. Utility asset management is one of them (that's where I'm housed), utility vegetation management, environmental consulting and then mitigation services. 

Kelly Kohlleffel (02:42)

Do clients typically utilize services across those four main areas, or do you find that you’ve got a big base of clients that's just environmental consulting, just utility asset management, etc.?

Clinton McFall (02:53)

I think it's a mix, honestly. I do think that there are pockets of clients that only use specific markets, but in the utility space, it's definitely growing. There's a lot of cross-market pollination there, and a lot of cross-market service line components as well. 

Kelly Kohlleffel (03:15)

I'm guessing on the utility asset management side, there are multiple segments as well.

Clinton McFall (03:21)

Absolutely. There are four main segments of utility asset management. Construction services: If you're ever driving around and you see an old, beat up, maybe broken, utility pole sitting next to a new one, we’ll go out there and help service those. We'll move those attachments from an old pole to a new pole, help pull that pole and get that liability out of the way for the client. We’ll be able to provide the data and associated records back to them. 

There's another one in that same market, drop removal, where we'll work with clients and help look at their data and information to determine where they're attached to poles that they don't actually service anymore. They are paying rent on those poles, and we can help them remediate those and get those liabilities off their books as well. 

Then there’s laying fiber. For example, when new internet service providers are interested in servicing a territory or an area, we'll actually go out and string up the fiber for them and help them make all the connections.

That’s one whole market of just construction services. There are three other markets, too. We have field services within utility asset management. We essentially conduct inventories or verifications for our clients. 

It’s exactly what it sounds like. They'll ask us, “Hey, we want to go get all of our assets. We want to inventory those. We have a data model that we would like them to fit within.” So we'll go out, we’ll help them collect that information and provide it back to them so they can use it for business reasons. With verification, it’s the same thing. They'll give us data that they want verified. Maybe it's geospatially incorrect. Maybe the actual tabular information is out of date. We’ll help them update that information and help them ingest that back into their respective systems. 

For the last two, we have engineering. We do make-ready engineering. When internet service providers want to come into an area and string up on X number of poles, they think they want to go down this route. We’ll help them find the optimal routes, help them make sure they can string up on the poles that exist or if they need to bring in more poles. Questions like: What are the permitting constraints? What needs to be done there? We kind of help both sides, depending on the client. We might work for a pole owner, or we might work for somebody trying to attach to a pole.

The last one is reliability services, where we go out and help our clients inspect their assets and then do any restoration that may be necessary. If you have a pole that has decayed or was hit by a car and needs to be restored, depending on the spec and the requirements of the client in the area, we can go out and facilitate those things. 

Those are really the four main segments of utility asset management.

Kelly Kohlleffel (06:22)

There's a lot of physical there, but the physical and the decision making has such a dependency on the amount of data that you're able to gather and the way you're able to use it effectively on a day-to-day basis. So I’m really interested in exploring this with you. 

Clinton, you talked in the opening about several large-scale implementation projects. One of those was the migration of on-prem infrastructure to the cloud. Talk me through that. 

Clinton McFall (06:57)

From a scope perspective, it was really focused on our asset management infrastructure.

We have a lot of different footprints and different cloud providers, still some on-prem, so we're multifaceted in that sense. We have a trajectory mapped out till 2030, and based off of that, we wanted to make sure we had the runway and a solid foundation to get there.

A lot of this infrastructure was geographic information systems based, so we really wanted to make sure that we had the ability to scale up and out. Moving that specific stack to the cloud really gave us that ability to do it. To give some specifics, we can regionalize servers based on geographic areas or specific clients so that we can attune the operation to those constraints. That was really the driving factor to make some of those decisions.
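The regionalization McFall describes can be pictured as a simple routing layer: clients are assigned to regional server pools so each region's capacity can be scaled and tuned independently. This is an illustrative sketch only; the pool names, client names and mapping are made up, not DRG's actual architecture:

```python
# Hypothetical region routing: assumes each client is assigned to a
# regional pool of GIS servers that can be scaled independently.
REGION_POOLS = {
    "southeast": ["gis-se-1", "gis-se-2"],
    "midwest":   ["gis-mw-1"],
}
CLIENT_REGIONS = {"client-a": "southeast", "client-b": "midwest"}

def servers_for(client):
    """Return the server pool that handles a given client's GIS workload."""
    return REGION_POOLS[CLIENT_REGIONS[client]]

print(servers_for("client-a"))  # → ['gis-se-1', 'gis-se-2']
```

The point of the indirection is that adding capacity for one region, or moving a client between regions, is a data change rather than a code change.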

We’ve been very successful with it. We've learned a lot of lessons along the way, as anyone would. The other thing I would say about cloud-based infrastructure is: test it. Try to create your infrastructure there and bake it. Let some systems go through their paces. Of course, when you throw full volume at it and throughput, that's when you're going to see where you need to make some adjustments and tune things. Be ready for that and plan for that tuning stage after you make that migration. But overall, it was a good project for us.

Kelly Kohlleffel (08:44)

I can imagine, on the utility side, that your core GIS systems really are your operational systems that you're living with every day. Where did your data warehouse, data lake and these types of workloads live, where you may be combining some GIS info with other operational info from databases or other applications? Was that on-prem then and migrated, or did you just start that out in the cloud?

Clinton McFall (09:17)

The inception of that really did start on-prem and in a more relational database structure, but as technology has changed so quickly in the last few years, the concept of the data lake has really matured. 

We still have a footprint in many areas, but we do segment specific data points based on requirements and whether or not we're owners or the clients are owners. That's something that we spend a tremendous amount of time on. That's something that we really respect. Who are the owners of the data, and are we shepherds of that data for them as we go through the project and the process? 

For our datasets that we use to really drive our operation, those are enriched in the cloud-based environment. We use tools like BigQuery. We use a variety of different BI tools as well, like Looker Studio and some things from Microsoft. We have a pretty diverse footprint, a lot of which is dependent on what our clients are able to consume, the requirements that we have from our clients and those types of things. We get to play in a lot of different toy boxes.

Kelly Kohlleffel (10:32)

When you look at the challenges on a migration project like the one you did, which sounds like it had multiple dimensions — data platforms, your core operational platforms, GIS and the three major areas that you had to take into account. You’ve got technology, you’ve got process and approach and you have the people aspect. Which was the most challenging out of those three for you during this process? What felt like the easy part?

Clinton McFall (11:05)

I would say clear communication is always a challenge on a large-scale project like that. Making sure that the team fully understands what they have to contribute. I think that's the big thing.

The other challenge with this, and I'm sure every data professional that you probably speak to is going to say this, is the data governance changes that you institute along the way.

Data governance is not the sleek, sexy thing; the sleek, sexy things are AI/ML, large language models, those types of things. But those things are driven by data. The data governance behind that is a big portion, so I would say that's also a challenge. It's a bit of a change management piece and helping the organization understand, whether it be field collection or a process in the back office, those data governance components really impact the ability to scale things and leverage some of those new technologies.

There's a lot of excitement in the organization, as in any organization, around AI and ML. That's the big buzz right now, right? I think the most important thing is understanding what challenges you can apply those technologies to.

I think a lot of folks are excited by the technology and they're looking for a place to go apply it. You need to really understand the challenges you have and where those technologies organically fit so that you can build something. The challenging part to that data governance piece is: how do you help the organization understand the value so they can leverage those new technologies?

As we started pulling together a data lake for our own internal metrics and tracking, we understood that there were a lot of disparate datasets that have value when they're brought together. It isn't necessarily easy to understand how those things connect. We have to help the organization understand the value of putting the governance in place for how we capture that data, so that we can then see that value come out in products, dashboards and AI/ML tools.

Kelly Kohlleffel (13:26)

Are there any examples where you said, “Hey, I hadn't even thought of this combination of datasets,” that stood out to you, either back then or maybe that you're working on today?

Clinton McFall (13:38)

Yeah, absolutely. We did a couple POCs with some very interesting partners where we started pulling some of our telematics information from our trucks out in the field so we could see the locations. You could see fuel consumption, you could see idle time, you could see acceleration, braking, those types of things. We were really trying to figure out: is there a way to use that to help the operation and improve safety for our staff and our employees? 

We've actually had some very interesting use cases where we've had folks that were stuck in their vehicles with no cell service, and we were able to realize that person wasn't able to get back to the yard in time. The vehicle is still out and about, and we were able to find it, deploy some folks to help get them unstuck and get them back before the evening. We were able to tie it together with telematics data and traffic data.
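The safety check McFall describes, noticing a vehicle that is still in the field and hasn't moved, can be sketched in a few lines. Everything here (the field names, the 30-minute idle threshold, the sample records) is a made-up illustration, not DRG's actual telematics system:

```python
from datetime import datetime, timedelta

# Hypothetical telematics records: last movement time and return status per vehicle.
vehicles = [
    {"id": "T-101", "last_moved": datetime(2024, 5, 1, 15, 10), "returned": False},
    {"id": "T-102", "last_moved": datetime(2024, 5, 1, 17, 48), "returned": True},
]

def flag_stuck(vehicles, now, idle_cutoff=timedelta(minutes=30)):
    """Flag vehicles still in the field that haven't moved past the idle cutoff."""
    return [v["id"] for v in vehicles
            if not v["returned"] and now - v["last_moved"] > idle_cutoff]

print(flag_stuck(vehicles, now=datetime(2024, 5, 1, 18, 0)))  # → ['T-101']
```

In a real-time pipeline this kind of rule would run against streaming pings rather than a static list, which is where the freshness of the data feed matters.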

Wildfires are a big thing for us as well — knowing where they are happening so that we can make sure that our teams are safe and have the ability to check in and check out. Being able to tie all those datasets together in a dashboard of sorts and being able to see where they're at and tie in some of that production information is very valuable. But you can imagine the scale of that project to be able to launch something like that full-bore.

Kelly Kohlleffel (15:09)

Talk to me a little bit about data delivery into your data lake environment and into your data warehouse environments. What are you doing today? What's the diversity of data sources that you have? What are some of the challenges that you have there? What are you doing to address those?

Clinton McFall (15:27)

We are very particular about what ends up in the data lake and what ends up in those isolated data structures that we spoke of. That is something that we have to be very cognizant of. As far as the data lake components go, that's a lot of the data that we use from an operational perspective to really understand what we're doing, how we're doing and be able to maybe do some predictive models on production. 

We have somewhere in the ballpark of 30-40 different data pipelines right now that we're using to really help manage the operation and get those insights internally. We’re kind of dogfooding what we're building so that we can put that in a package model when we see fit to do those things for our clients. For the delivery, obviously, Fivetran is a partner of ours as well. We’ve found a good partnership there and we’re really learning how to use these tools and push them to their limits. 

It's open data sources too: weather, traffic, those fire sources, then our production tracking components. Telematics was one that we mentioned; we’re POCing that. So yeah, it's a maturing process for us.

Kelly Kohlleffel (16:56)

You mentioned AI, ML, GenAI. Anything that you’ve seen this year that’s made you go, “Hey, that might be a really interesting GenAI POC or POV or pilot”? Is there some concrete goal, objective or business problem that you see out there in that AI, ML and GenAI space that maybe you haven't tried yet today, but is on that list for you later on this year?

Clinton McFall (17:24)

I think that's a great question. There are a couple of things that come to mind, but I think the biggest thing is figuring out how we can create something in our product tier for our clients where they can interrogate the data in a new way.

We see a lot of these large language models. I hear a lot of buzz about putting in unstructured data like photos or PDFs and being able to read from that. I think it would be interesting to be able to interrogate a map that way, right? With my GIS background, I wonder: What if you could just ask basic questions about specific geographic regions? or On average, in this zip code, how many poles exist? or How many poles in my projects exist here and what's on those poles? 

What if we could provide that to a client as we're going through the project with them and give them an opportunity to easily ask questions without having to write complex queries or query wizards? I think there might be some value there. 
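Behind a natural-language layer like the one McFall imagines, the question still has to bottom out in an ordinary aggregation over the pole data. A minimal sketch of the query that "How many poles exist in each zip code?" might translate into, using a made-up schema rather than DRG's actual data model:

```python
import pandas as pd

# Hypothetical pole inventory; the columns are illustrative only.
poles = pd.DataFrame({
    "pole_id":     ["P1", "P2", "P3", "P4"],
    "zip_code":    ["30144", "30144", "30060", "30060"],
    "attachments": [3, 1, 2, 4],
})

# The aggregation a GenAI layer might generate from the question
# "How many poles exist in each zip code?"
counts = poles.groupby("zip_code")["pole_id"].count()
print(counts.to_dict())  # → {'30060': 2, '30144': 2}
```

The hard part is not the aggregation itself but reliably mapping free-form questions onto queries like this, scoped to only the data a given client owns.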

Kelly Kohlleffel (18:25)

Really interesting. What about changing tracks just a bit? You’ve been leading this technical services team, and there’s a huge data leadership component with all the data that you're dealing with at Davey. What qualities are most important for leading a data team? If there's a specific story or example, that might be really interesting to hear about.

Clinton McFall (18:57)

The first thing that comes to mind is curiosity, right? With the data industry, I think, you really have to understand what dataset you have, what you're working with and what you can do with it. Folks that inherently have curiosity will ask questions. They want to understand those datasets. They want to push it to the limit to understand what they can do with it and what stories they can tell.

Secondarily to that is the attention to detail. Once you understand what you have as far as a dataset, to be able to blend that with other datasets, you’ve got to have that attention to detail. 

I've got a really amazing team of solutions architects. The fact that they are as curious as they are has led to new, innovative solutions that we would have never thought of behind the scenes. Some of that is exactly how we got into a partnership with Fivetran. It’s how we’re able to use the technology to help create isolations of certain datasets so that we can keep the production environment healthy and keep it functioning with minimal impact, but still be able to provide insights to our clients or our operations. 

Being able to generate those reports on a daily basis is powerful, but being able to have it in real-time, that's a different ballgame. Being able to see some of that and make business decisions off of it was the impetus for exploration and saying, “All right, how far can we take real-time information and data, and what are the limitations of it?” 

There are limitations, don't get me wrong, especially when you're blending datasets that are coming in at different intervals. You definitely have to be careful with that, because it can tell a different story if you're not. But yeah, that was the inception of the idea and the kind of curiosity that got us here.
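McFall's caution about blending feeds that arrive at different intervals is worth making concrete. One common guard is an as-of join, which matches each record to the latest observation at or before its timestamp instead of pretending the feeds are synchronized. A minimal sketch with pandas, using invented feed names and values:

```python
import pandas as pd

# Two hypothetical feeds on different cadences:
# production snapshots hourly, weather readings every 15 minutes.
production = pd.DataFrame({
    "ts": pd.to_datetime(["2024-05-01 09:00", "2024-05-01 10:00"]),
    "units": [12, 15],
})
weather = pd.DataFrame({
    "ts": pd.to_datetime(["2024-05-01 08:45", "2024-05-01 09:30",
                          "2024-05-01 09:55"]),
    "temp_f": [61, 64, 66],
})

# merge_asof pairs each production row with the most recent weather
# reading at or before it, making the interval mismatch explicit.
blended = pd.merge_asof(production, weather, on="ts")
print(blended["temp_f"].tolist())  # → [61, 66]
```

A naive join on exact timestamps would drop every row here; the as-of join keeps the story honest about which reading actually applied at each point in time.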

Kelly Kohlleffel (21:00)

What source did your solution architects start with when you first chose Fivetran? I assume it was going to BigQuery, but what was that source that made them say, “Hey, this is the one, let's go after it”?

Clinton McFall (21:13)

Really it was trying to get some better insights on our production and understanding what we were doing operationally so that we can provide better insights to our clients. Here's the timeline that we're going to meet. Here's how we're going to do it. 

Kelly Kohlleffel (22:10)

Here you are a few years later. Well, Clinton, this has been great. I have learned a lot. I really appreciate you joining the show today and I look forward to keeping up with everything you're doing at Davey.

Clinton McFall (22:19)

Well, I appreciate being here. It was a great conversation and I'm excited to keep following the podcast.

Kelly Kohlleffel (22:25)

Fantastic. Thanks to everybody who listened in today. We appreciate each one of you. 

Mentioned in the episode
Why Fivetran supports data lakes
How to build a data foundation for generative AI

More Episodes

Why everything doesn’t need to be Gen AI