Bring LLM to WA Community

These are tutorial and discussion notes on applying LLMs and knowledge graphs (KG) within the wider WA community.

Generative AI is here, so let us start rethinking how it can be applied to our lives, our work, and our organisations.

However, it is not a panacea, and there are several obstacles that stop it from being widely adopted.

  • Data hallucination problem.

    • Its output is not always consistent with the truth

    • And there is no good indicator of when it is not telling the truth

  • Privacy concern.

    • OpenAI and other public LLM APIs are great

    • However, we have private data that we do not want to be leaked

  • It is not deterministic.

    • Rule-based programming gives you the same result every time for the same input

    • That is not the case for an LLM

    • Its output is probability based (a toy sketch follows after this list)

  • How to make its answers in-context with my own data?

    • Last but not least

    • It can answer general questions, but if I ask it where I was yesterday, it has no way to know

    • How can we integrate it with the private knowledge we have?
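
As a toy illustration of the probability point, the snippet below samples the next word from a made-up distribution. This is not how any production LLM is served, but it shows why the same input can produce different outputs across runs:

```python
# Toy illustration of probability-based generation: the model assigns a
# probability to each candidate next token and samples from that
# distribution, so identical inputs can yield different outputs per run
# (unlike rule-based code). The candidates and probabilities are made up.
import numpy as np

candidates = ["Perth", "Sydney", "Melbourne"]
probabilities = [0.6, 0.3, 0.1]  # hypothetical next-token distribution

for run in range(3):
    print(np.random.choice(candidates, p=probabilities))
```

Setting the sampling temperature to 0 (greedy decoding) makes most LLM APIs far more repeatable, but it does not turn them into rule-based systems.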

The first three are problems; the last one is an opportunity.

So to bring LLMs to the wider community in WA, we will first need to solve or mitigate the first three issues.

There are some promising solutions emerging from the research world.

  • Data hallucination problem => RAG: Retrieval Augmented Generation

  • Privacy issue => Self-hosted LLM, or private cloud LLM API

  • Non-determinism => AutoGPT

And the last problem will need to be solved through engineering: first we need to solve the data fusion problem, so that the LLM can retrieve the most relevant information and generate proper, in-context answers.
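
As a rough sketch of that retrieve-then-generate (RAG) flow, the example below embeds a few private notes, picks the most relevant one for a question, and sends it as context to an OpenAI-compatible chat endpoint. The notes, endpoint URL, and model name are placeholders, and it assumes the sentence-transformers, numpy, and requests packages are installed:

```python
# A minimal RAG sketch: embed private notes, retrieve the most relevant one,
# and pass it to the LLM as context. The notes, endpoint URL, and model name
# below are placeholders, not the project's actual configuration.
import numpy as np
import requests
from sentence_transformers import SentenceTransformer

notes = [
    "Yesterday I visited the UWA campus in Perth.",
    "The team meeting is scheduled for Friday at 10am.",
    "The self-hosted LLM endpoint runs on the lab GPU server.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
note_vectors = embedder.encode(notes, normalize_embeddings=True)

question = "Where was I yesterday?"
query_vector = embedder.encode([question], normalize_embeddings=True)[0]

# Vectors are normalised, so a dot product gives cosine similarity
best_note = notes[int(np.argmax(note_vectors @ query_vector))]

prompt = (
    "Answer the question using only the context below.\n"
    f"Context: {best_note}\n"
    f"Question: {question}"
)

# Hypothetical OpenAI-compatible chat endpoint (e.g. a self-hosted server)
response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={"model": "local-llm", "messages": [{"role": "user", "content": prompt}]},
)
print(response.json()["choices"][0]["message"]["content"])
```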

We are working towards that, and will bring some examples for the wider community to adopt.

If you cannot wait to start hacking and want to test the self-hosted endpoints, check here:

https://uwa-nlp-tlp.gitbook.io/llm-tutorial/bring-llm-to-wa-community/public-open-llm-api
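
For a quick test, a self-hosted, OpenAI-compatible endpoint can usually be queried with the official openai Python client by pointing it at the server's base URL. The base URL, API key, and model name below are placeholders; the actual endpoints are documented on the linked page:

```python
# Sketch of querying a self-hosted, OpenAI-compatible endpoint with the
# official openai client (v1+). The base_url, api_key, and model name are
# placeholders; substitute the values from the linked endpoints page.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="local-llm",
    messages=[{"role": "user", "content": "Give me a one-sentence summary of RAG."}],
)
print(reply.choices[0].message.content)
```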
