Executive business developer Bill Sullivan's professional career began more than two decades ago at IBM, and he has since held leadership positions at Oracle, Cloudera, and Amazon Web Services, among others. With his expertise in strategy and business development, he has worked to drive organizations into their target markets.
In a recent interview with ExecutiveBiz, Sullivan, who is now vice president and general manager of Denodo Federal, spoke about the challenges government agencies face in moving to the cloud and adopting AI, as well as Denodo's expansion into these areas of the federal market.
What are some of the major barriers that remain to widespread federal adoption of AI, and how do you think we can overcome them?
There are two major barriers, the first being contractual. There are increasingly high and growing barriers to government adoption of new AI applications. Some are essential, such as the security requirements for working with the federal government, but some are contractual, and this last group can and should be easily overcome. The government needs to provide an incentive for the system integrator community to add new technology to existing or prospective five-year fixed-price contracts. Right now, work is often awarded on a fixed-price contract with little incentive for master contract holders to try to add new technology. By adding consideration of new technologies to these contracts, the government can reward primes for introducing new technology as it becomes available.
The second barrier is architectural. The government has data scattered on premises, in hybrid clouds, and across multiple public clouds.
AI applications deliver more accurate and predictable results when they have access to a consistent source of relevant, reliable data drawn from as many sources as possible. Given that distributed data architecture, making more data available and locating, accessing, and using it in a timely manner are particular challenges for the government. The typical company does not have three or more disparate cloud service providers operating in its environment, but this is common for the government.
Many of our customers use Denodo because we allow them to access data across multiple clouds in real time, securely, and at scale. The ability to find, index, and make all of their data available enables their AI applications to perform better in testing in a government customer environment, and lets them bring selected AI applications into production faster and more cost-effectively. The Denodo Platform also features AI/ML functionality for dataset recommendation, collaboration, performance optimization, and DataOps, which is highly valuable to our enterprise customers.
What is the biggest challenge you see as federal agencies migrate to the cloud? What solution do you suggest for this problem?
Migrating to the cloud is the easy part. Every major cloud provider offers data ingestion tools and advisory services designed to help customers get data into its environment. What's hard is accessing and understanding the data you hold across a multi-cloud environment, and ensuring that the data you're accessing is complete, timely, and accurate for a specific purpose.
The reason Denodo saw 203 percent revenue growth in its federal business over the past year is that our government customers understand that we allow them to index their data and, instead of replicating it for applications, production systems, or data lakes, use the data where it lies. By leveraging data virtualization to enable data access, we bring the computing to the data rather than bringing the data to the computing.
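To make the idea concrete, here is a minimal sketch of the data-virtualization pattern Sullivan describes: a virtual layer answers a query by pulling rows from each source at query time and joining them in place, rather than copying everything into a central data lake first. The source names and schema below are hypothetical illustrations, not Denodo's actual API.

```python
def cloud_a_missions():
    # Stand-in for a live query against a table held in cloud provider A.
    return [
        {"mission_id": 1, "name": "Sentinel"},
        {"mission_id": 2, "name": "Vanguard"},
    ]

def cloud_b_budgets():
    # Stand-in for a live query against a table held in cloud provider B.
    return [
        {"mission_id": 1, "budget_musd": 120},
        {"mission_id": 2, "budget_musd": 85},
    ]

def virtual_join():
    # The "virtual view": data stays where it lies; only the joined
    # result is materialized, and only for the duration of the query.
    budgets = {row["mission_id"]: row["budget_musd"] for row in cloud_b_budgets()}
    return [
        {**mission, "budget_musd": budgets.get(mission["mission_id"])}
        for mission in cloud_a_missions()
    ]

if __name__ == "__main__":
    for row in virtual_join():
        print(row)
```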
The second reason we've seen this growth is that we're able to do this securely, at scale, in both classified and unclassified environments, accessing structured and unstructured data. Finally, we're growing because we're uniquely helping the government manage permissions for data access, ensuring that only the people who should have access do, and we can provide an audit of that access.
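The access-control pattern Sullivan mentions can be sketched simply: every read passes through a policy check, and every decision, allowed or denied, is appended to an audit log. The roles, classifications, and dataset names here are hypothetical assumptions for illustration only.

```python
import datetime

# Hypothetical policy: which classification levels each role may read.
POLICY = {"analyst": {"unclassified"}, "commander": {"unclassified", "classified"}}
AUDIT_LOG = []

def read_dataset(user, role, dataset, classification):
    allowed = classification in POLICY.get(role, set())
    # Record the access decision before acting on it, so denials are audited too.
    AUDIT_LOG.append({
        "when": datetime.datetime.utcnow().isoformat(),
        "user": user,
        "dataset": dataset,
        "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{user} may not read {dataset}")
    return f"contents of {dataset}"  # placeholder for the real data fetch

print(read_dataset("alice", "commander", "targeting_feed", "classified"))
for entry in AUDIT_LOG:
    print(entry)
```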
Our clients include the Missile Defense Agency, the US Army, and ten national laboratories. We handle some of the largest and most complex data sets in the US government, securely and at scale. We also have three of the largest systems integrators as customers, who use us for the same reason: to understand where their data is and access it in real time without having to duplicate it.
One of the things the government is waking up to, on a monthly or quarterly billing cycle, is the cost of running its cloud applications. Clients using Denodo can access their data without incurring I/O charges for every routine query; if they have a regularly scheduled query, we can cache the data while still making sure they're accessing the latest version.
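The caching idea above amounts to holding query results for a time-to-live window so repeated reads don't each trigger a billable round trip to the cloud source. A minimal sketch follows; the expensive_cloud_query function and the one-hour TTL are assumptions for illustration, not Denodo's actual caching mechanism.

```python
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expiry_time, value)

    def get_or_fetch(self, key, fetch):
        entry = self.store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]                       # fresh: no cloud I/O incurred
        value = fetch()                           # stale or missing: one fetch
        self.store[key] = (time.monotonic() + self.ttl, value)
        return value

def expensive_cloud_query():
    # Stand-in for a scheduled query that would incur egress/I/O charges.
    return {"rows": 42}

cache = TTLCache(ttl_seconds=3600)  # refresh at most hourly
print(cache.get_or_fetch("daily_report", expensive_cloud_query))
print(cache.get_or_fetch("daily_report", expensive_cloud_query))  # served from cache
```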
Where do you see expansion opportunities in your company’s portfolio? What new possibilities or markets are you looking forward to?
Government at all levels is moving to a multi-cloud world, and we're seeing an explosion of Denodo adoption across all federal market segments as well as in state and local governments. The government is being pushed or pulled into the multi-cloud environment, and once there, agencies start thinking about how to access their data.
One solution is to build a data lake in the cloud. However, it is possible to leave the data where it is and access it in real time, which turns current thinking on its head. Customers are realizing they don't need to first transfer data into one cloud and then build a data lake; we can leave the data in diverse cloud environments and harness the value each cloud uniquely brings to the customer.
We can also access that data quickly without having to transfer it to a data lake a second time or incur I/O fees.
We have a dedicated sales team for each of the federal, state, and local government markets, with customized marketing, channels, and contract vehicles through Carahsoft. This enables us to serve government at all levels.
What can you tell us about how federal agencies handle the massive data influx we are seeing today?
Data architecture has historically swung between centralized and decentralized phases. We are now at a juncture where enterprise data is so distributed that it is unlikely to ever be centralized again, and this requires new approaches to handling distributed data. The challenge for the government is not only the huge volume of data spread across various clouds and the rate at which it is growing, but also understanding the data it has and using it in a way that makes it valuable, and doing all of this in a timely manner. Fortunately, this is what we do.
We can enable initiatives like Joint All-Domain Command and Control, or JADC2. There will always be some latency, but the goal is to add no additional latency in accessing the data needed to make targeting and defense decisions in a given tactical situation.
This is where Denodo really shines. You need unparalleled access to data to generate actionable insights from massive amounts of information, whether structured or unstructured, in a classified environment, to support life-and-death decisions such as targeting or defense. We can be an essential partner.