Knowledge Driven Prompts

Communication with machines: Digital, Symbolic, Physical


Language has always been an underlying subtext of artificial intelligence, whether from an engineering (computer languages), communication (user interfaces), or philosophical (truth in knowledge) perspective. While technologies have long confined AI advances to separate swimlanes, with computing power at the forefront, the spread of generative technologies like LLMs also induces a convergence with the knowledge and communication perspectives, due to mounting truthfulness issues. Knowledge is arguably the core issue raised by generative AI, one that calls for harnessing both symbolic (typically knowledge graphs) and non-symbolic (typically neural networks) representations. Communication thus appears as a dependent issue that can be considered summarily here.

Knowledge Reference Model

The sudden and anarchic expansion of LLMs has been accompanied by a cottage industry producing the prompts needed to cope with the poor reliability of LLM outcomes. But most of these cottages rely on fine-tuned scripts that take no account of the artificial nature of the interlocutors and, consequently, of their intrinsic limitations. At the same time, a general consensus is emerging about the need to back LLMs with some form of knowledge. Both issues could be addressed if dialogs with LLMs were framed by a knowledge reference model. That can be achieved by using ontological prisms to organise prompts into six categories:

Prompts pertaining to the nature of the symbolic resources employed:

  • Factual sources: datasets and/or documents
  • Categories behind the terms employed, grammatical or semantic
  • Concepts, values and intents in support of answers

Prompts pertaining to outcomes and the use of symbolic resources:

  • Facts/concepts: elucidate the meaning of individual items in terms of ideas, values, or intents
  • Facts/categories: elucidate the structure and features of individual items
  • Concepts/categories: how reasoning is supported by structures and concepts

It must be stressed that the alignment of prompts with ontological prisms greatly enhances the interoperability of LLM-based applications with enterprises’ data analytics, business models, and information systems.
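The six-category reference model above can be made concrete as a prompt catalogue keyed by ontological prism. The sketch below is a minimal illustration, not an implementation from the text: the enum members, template strings, and `build_prompt` helper are hypothetical names chosen for the example.

```python
# Hypothetical sketch: a prompt catalogue organised by the six
# ontological prisms (three for resources, three for outcomes).
from enum import Enum

class Prism(Enum):
    # Resources: nature of the symbolic resources employed
    FACTUAL_SOURCES = "factual_sources"
    CATEGORIES = "categories"
    CONCEPTS = "concepts"
    # Outcomes: use of symbolic resources
    FACTS_CONCEPTS = "facts_concepts"
    FACTS_CATEGORIES = "facts_categories"
    CONCEPTS_CATEGORIES = "concepts_categories"

# Each prism holds parameterised prompt templates (examples only).
CATALOGUE = {
    Prism.FACTUAL_SOURCES: ["Rank the documents {docs} by date.",
                            "Give the provenance of {item}."],
    Prism.CATEGORIES: ["Find a more general category than {term}."],
    Prism.CONCEPTS: ["Find a metaphor or analogy for {concept}."],
    Prism.FACTS_CONCEPTS: ["Find another example (fact) for {term}."],
    Prism.FACTS_CATEGORIES: ["What features do you know about {fact}?"],
    Prism.CONCEPTS_CATEGORIES: ["Find a concept for the features {features}."],
}

def build_prompt(prism: Prism, index: int = 0, **params: str) -> str:
    """Instantiate a template from the catalogue with the given parameters."""
    return CATALOGUE[prism][index].format(**params)
```

Keeping templates grouped by prism, rather than in a flat list, is what allows prompts to be aligned with a knowledge reference model and, as noted above, with enterprise data analytics and information systems.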

A Sample of Basic Prompts

Prompts about Resources

Factual sources

  • Rank documents by date
  • Rank datasets by date
  • Get provenance


Categories

  • Find a more general category
  • Find a more specific category


Concepts

  • Find a metaphor or analogy (concepts)
  • Find a metonymy or synecdoche (concepts)

Prompts about Outcomes


  • Write a definition without circular references
  • Write a summary without circular references

For a comprehensive and principled catalogue, see Raphaël Mansuy.


  • Find another example (fact) for that term (concept)
  • Find a synonym (concept)
  • Find an antonym (concept)


  • Find a more general category to describe this fact
  • Find a more specific category to describe this fact
  • How do you know about this (deduced) fact
  • What do you know (features) about this fact


  • Find a concept for these features (category)
  • Find features (category) for this concept
  • Tell me more about this idea
  • How do you know about this (induced) category
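The sample prompts above suggest a dialog pattern: an outcome prompt produces an answer, and resource prompts then probe its grounding in sources, categories, and concepts. The sketch below illustrates that pattern under stated assumptions: `ask_llm` is a stand-in for any chat-completion call, not a real API, and the probe wordings are hypothetical.

```python
# Hedged sketch: framing a dialog with an LLM by following an answer
# with resource-oriented probes (sources, categories, concepts).
# `ask_llm` is a placeholder for an actual LLM client call.
from typing import Callable

def framed_query(question: str, ask_llm: Callable[[str], str]) -> dict:
    """Ask a question, then probe the answer's symbolic grounding."""
    answer = ask_llm(question)
    probes = {
        "sources": ask_llm(f"List the factual sources behind: {answer}"),
        "categories": ask_llm(f"Name the categories behind the terms in: {answer}"),
        "concepts": ask_llm(f"State the concepts supporting: {answer}"),
    }
    return {"answer": answer, "probes": probes}
```

The point of the pattern is that the probes are derived from the reference model rather than improvised, so every answer can be checked against the same three resource prisms.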




