Knowledge graph technology: sharpening data visibility for better decisions
• Knowledge graph technology is a new way of visualizing data across organizations.
• It can help inform and guide stronger business decisions.
• It uses open-source generative AI to deliver focused data.
Knowledge graph technology is rewriting the way objects, people, companies, and supply chains can be visualized, examined, and mined for data, helping to deliver efficiency savings, meet data-reporting requirements, and much more.
In Part 1 of this article, we sat down with Paul Hopton, CTO at Scoutbee, a leading company offering knowledge graph technology to enterprise clients, to understand how it could be used in supply chains.
But while we had Paul in the chair, we decided to take a deeper dive into knowledge graph technology as it applied to corporate governance and better decision-making.
GenAI delivering knowledge graph technology
Correct us if we’re wrong here, but you use generative AI to deliver your knowledge graph technology, right?
We do, but it’s not… quite ChatGPT as we know it, Captain.
Intriguing. How so?
Effectively we have our knowledge graph, where we capture all the information we find, and then we make elements of that graph available to the customer. What’s very important, both for knowledge graph technology and for AI in general, is multi-tenancy support.
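As a loose sketch of what multi-tenancy means here, consider a shared graph where each node records which tenants may see it, and every customer-facing query is filtered through that visibility check. All names and the schema below are illustrative assumptions, not Scoutbee’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    id: str
    tenants: set  # tenants permitted to see this node

@dataclass
class KnowledgeGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)  # (source_id, relation, target_id)

    def add_node(self, node: Node) -> None:
        self.nodes[node.id] = node

    def add_edge(self, src: str, relation: str, dst: str) -> None:
        self.edges.append((src, relation, dst))

    def view_for(self, tenant: str) -> list:
        """Return only the edges whose endpoints this tenant may see."""
        visible = {nid for nid, n in self.nodes.items() if tenant in n.tenants}
        return [e for e in self.edges if e[0] in visible and e[2] in visible]

# One shared graph, two tenants, no data leaking between them.
kg = KnowledgeGraph()
kg.add_node(Node("supplier:acme", {"tenant_a"}))
kg.add_node(Node("supplier:other", {"tenant_b"}))
kg.add_node(Node("cert:iso9001", {"tenant_a", "tenant_b"}))
kg.add_edge("supplier:acme", "holds", "cert:iso9001")
kg.add_edge("supplier:other", "holds", "cert:iso9001")
```

Here each tenant’s view is computed from the same underlying graph, which is the property enterprise customers nervous about shared AI systems tend to ask about first.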
ChatGPT is wonderful, but as it stands, it’s not really designed with enterprise customers in mind. Or at least, the customers we speak to are very nervous about bringing those kinds of systems into their own.
So we think that training our AI specifically on the customer’s data makes a big difference. Once the AI can learn from the customer’s data what the customer actually wants to know, you can come to much deeper, more business-valuable conclusions, which gives the customer a competitive advantage.
Knowledge graph technology improves data focus
Ah yes. We’ve spoken to other companies doing different things with generative AI to boost a company’s productivity, and that seems to be key to all the standout offerings: focus, on either the area of interest or the company specifically. Training with specific data, rather than narrowing a more generalized AI down into that scenario.
Finding valuable data and drawing business conclusions – priceless.
We’ve built our models on open-source models. They’re smaller, but they don’t need to know anything about TV stars, or mathematicians, or how to bake an apple pie.
They need to know about suppliers, and products, and certifications. They understand geography. They understand things that are pertinent to the task of improving our customers’ knowledge of their own company and their relationships with others.
That still means we’ve been working with 7-billion-parameter models, and we’re now moving up to some 14-billion-parameter models, which give us much better, much more interesting results. But we don’t need the same kind of scale that ChatGPT or Bard have, because we’re solving a niche problem.
That specialist knowledge is really valuable. And having all that information in the knowledge graph database which the AI can interrogate feels exciting, and has been clearly shown to add value to our customers’ businesses.
Knowledge graph technology and open-source
Was it that idea of smaller, more focused generative AI models that drew you to open-source? We remember the ripple of terror that went through the big players when it became clear that the open-source community were getting their hands on generative AI models, precisely because they could do more focused, flexible things with significantly less compute and cost.
It’s a story we’ve seen time and again. Things are supposedly going to change the world, and it’s rarely while they’re monopolized by big companies that it happens. All the innovative stuff is now sitting on open-source systems. Information wants to be free, and it’ll find a way of becoming free. That’s what the open-source movement has done. And we had to take advantage of that.
We’re comfortable that we can still build a good business model on top of this. Because what we essentially do is use the AI to give people better access to the information which has already been gathered in their systems, and which they’ve shared with us.
It’s that kind of building up that makes the difference. Here’s the data we found on the internet, let’s use it in our knowledge graph technology solution. Here’s the data which you’ve provided, which enriches the knowledge graph.
Now we’re looking at how we integrate other documents and information that organizations have, to build a much richer AI model for this.
One of the things we talk to our customers about a lot at the moment is the importance of starting to build that out now. If we jump two years into the future, companies that haven’t started engaging with AI are going to be asking hard questions, and answering hard questions from their shareholders.
Knowledge graph technology – norm of the future?
Are we confident then that knowledge graph technology is a norm of the future?
Well… we are, yes. You kind of have to be in it to win it. The people who are working on this now, in two years’ time, will have a very smart, sophisticated AI system, which understands everything that they want to do.
That’s the point with generative AI, isn’t it? It was launched with a bang, and it’s had a contradictory life since then, because on the one hand, it’s been adopted by almost everybody and put into almost everything.
And on the other hand, it’s had quite a few big players and big scientists come back and ask hard questions about whether we really want to do this, as fast as we’re doing it.
But with the open-source option, firstly, you’re not building anything that can necessarily escape its limited data paradigm, and, as is always the case with open-source, the more people you have working on different elements, the more problems you solve.
Exactly. And I think the capability you have of doing something very destructive is limited when you’re working with a comparatively small open-source model.
Legislation will be necessary at the upper end of the scale, but that’s not really where we are, and the point is, it’s not really where our customers need us to be. They need our models to be focused on their companies, their data points, and their supply chains.
As you say, jettison the apple pie recipes.
Knowledge graph technology – a new way of looking at data
There are players in the field who’ve seen the advantage of being able to learn incrementally. Knowledge isn’t a finished thing that you can start at the top left and work down to the bottom right. It grows and grows, organically and in different directions.
That’s why companies like LinkedIn have started using knowledge graph technology – a person’s a person, but graphing what that means, and understanding that person through their professional life and their career interactions, is quite hard to think about.
Putting them in a table is maybe fine for a coding exercise if you’re learning a new programming language, but that’s not something you’re ever going to build a business on.
A person’s a person, no matter how small… but they’re also a data point with several connecting data points.
Exactly. I think our typical supplier is probably around 150 interconnected data points. Not mapped in columns and rows, but as a bunch of connected nodes.
And the AI helps us find relationships and nodes which we didn’t see before. And each new relationship and each new node is a potential unit of added value for the company that has it.
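To make the “connected nodes, not columns and rows” point concrete, here is a minimal sketch of a supplier as a small subgraph, with a traversal that walks outward across relationships. The supplier names, relations, and the `connected_facts` helper are all hypothetical, chosen only to illustrate the data shape.

```python
from collections import defaultdict

# Illustrative supplier subgraph: each fact is an edge between two nodes,
# rather than a cell in a fixed table. (Names are made up for this sketch.)
facts = [
    ("supplier:acme", "located_in", "country:de"),
    ("supplier:acme", "produces", "product:gearbox"),
    ("supplier:acme", "holds", "cert:iso9001"),
    ("product:gearbox", "requires", "material:steel"),
]

# Build an adjacency map so we can walk the graph from any node.
adjacency = defaultdict(list)
for src, relation, dst in facts:
    adjacency[src].append((relation, dst))

def connected_facts(start: str, depth: int = 1) -> list:
    """Collect every (node, relation, node) fact reachable within `depth` hops."""
    found, frontier = [], [start]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            for relation, dst in adjacency[node]:
                found.append((node, relation, dst))
                next_frontier.append(dst)
        frontier = next_frontier
    return found
```

The payoff of the graph shape is exactly the second-hop fact: asking about `supplier:acme` at depth 2 also surfaces that its gearbox requires steel, a relationship a flat supplier table would not express.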
That’s the ongoing power of knowledge graph technology.
28 September 2023