Applied graph theory is our key to unleashing network effects

The core of Ulobby's data model is a knowledge graph: a representation of the categories, properties and relations between the concepts, data and entities within political domains of discourse. The graph structure is stored in a graph database, which allows advanced, deep queries on relationships at many levels, using algorithms known from search engines, recommendation engines and social networks.
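To make the idea concrete, here is a minimal sketch of what a relationship query over a small political knowledge graph might look like, written in Python with the networkx library. The entity names, relation labels and the choice of networkx are purely illustrative assumptions, not a description of Ulobby's actual schema or graph database.

```python
# Minimal sketch of a political knowledge graph and a multi-hop query.
# All names and relation labels below are invented for illustration.
import networkx as nx

G = nx.DiGraph()

# Nodes are concepts, entities and data points; edges carry typed relations.
G.add_edge("MP Jane Doe", "Climate Committee", relation="member_of")
G.add_edge("Climate Committee", "Bill L-123", relation="reviews")
G.add_edge("Green NGO", "Bill L-123", relation="supports")
G.add_edge("MP Jane Doe", "Green NGO", relation="met_with")

# A "deep" query: how is a stakeholder connected to a policy issue?
path = nx.shortest_path(G, source="MP Jane Doe", target="Bill L-123")
print(" -> ".join(path))  # MP Jane Doe -> Climate Committee -> Bill L-123
```

Multi-hop traversals like this one are exactly what a dedicated graph database is designed to answer efficiently at scale.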

As today's digital processes create more and more data, data silos often inhibit efficient information usage and seamless digital workflows. The knowledge graph allows Ulobby to connect existing data sources and systems and to create insight and knowledge by semantically enriching information and data flows. Having a connected body of information simply lets us ask new questions and discover new answers within the existing data. In policy processes the relationships between data points often matter more than the individual points themselves; by mapping all data, including stakeholders and stakeholder relations, in a directed mathematical structure, Ulobby allows for complex inferences about relevant stakeholders and their positions in relation to your policy issues. For an effective public affairs organisation, the political network is a living asset that is cultivated and developed.
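As a hedged illustration of that kind of inference, the sketch below ranks hypothetical stakeholders by their relevance to a policy issue using personalized PageRank over a directed relation graph. This is a generic technique from the search-engine family mentioned above, not Ulobby's actual method, and the graph is invented for the example.

```python
# Illustrative only: rank stakeholders by relevance to a policy issue
# with personalized PageRank over a directed relation graph.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Bill L-123", "Climate Committee"),
    ("Climate Committee", "MP Jane Doe"),
    ("Climate Committee", "MP John Smith"),
    ("Green NGO", "Bill L-123"),
    ("Industry Assoc.", "MP John Smith"),
])

# Seed the walk at the policy issue of interest and let relevance flow
# along the relationships; well-connected stakeholders score highest.
scores = nx.pagerank(G, personalization={"Bill L-123": 1.0})
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node:20s} {score:.3f}")
```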

Knowledge extraction from unstructured texts

Modern democracies produce an incomprehensible amount of text. The news cycle that was once dictated by the limitations of print media distribution has been replaced by constant streams across a plethora of channels.

Gaining an overview and being able to respond to current political issues as they unfold requires digital assistance.
Translating texts from prose into data that machines can process is a science that we cultivate religiously at Ulobby through ongoing research collaborations with universities and GTS institutes.

Reading and decoding political positions and policy documents is difficult even for humans, but recent technological developments within the field of Natural Language Processing have allowed our team to develop and apply a selection of algorithms that make the work much easier: topic recognition to discover and follow the agendas relevant to our users, named-entity recognition to determine the stakeholders in a given debate, and opinion mining to explore arguments and sentiments within a given political issue.
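As a concrete example of one such building block, the snippet below runs off-the-shelf named-entity recognition with the spaCy library over a short piece of political text. spaCy and its small English model are assumptions chosen for illustration; they are not necessarily the tools behind Ulobby's own pipeline.

```python
# Generic named-entity recognition example with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("The Climate Committee met on Tuesday, where Jane Doe argued "
        "that the bill should be amended before the vote in Parliament.")

doc = nlp(text)

# Extract the people, organisations and other entities mentioned in the text.
for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. "Jane Doe" PERSON, "Tuesday" DATE
```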
