Harnessing LLMs for Data Analysis
When we think of LLMs (large language models), what usually comes to mind are general-purpose chatbots like ChatGPT or code assistants like GitHub Copilot. But as useful as ChatGPT and Copilot are, LLMs have so much more to offer if you know how to code. In this demo, Joe Cheng will explain LLM APIs from the ground up and have you building and deploying custom LLM-powered data workflows and apps in no time.
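As a taste of what the demo covers, here is a rough sketch of a basic LLM API call in Python, assuming the openai package and an OPENAI_API_KEY environment variable; the model name and prompts are placeholders, not the demo's exact code:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful data analysis assistant."},
        {"role": "user", "content": "Summarize the trend in this monthly sales data: 10, 12, 15, 19."},
    ],
)
print(response.choices[0].message.content)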
Posit PBC hosts these Workflow Demos on the last Wednesday of every month. To join us for future events, register here: https://posit.co/events/
Slides: https://jcheng5.github.io/workflow-demo/
Resources shared during the demo:
Environment variable management:
- For R: https://docs.posit.co/ide/user/ide/guide/environments/r/managing-r.html#renviron
- For Python: https://pypi.org/project/python-dotenv/ (see the sketch below)
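For example, a minimal sketch of loading an API key from a .env file with python-dotenv; the file name .env and the OPENAI_API_KEY variable are assumptions:

import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from a local .env file into os.environ
api_key = os.environ["OPENAI_API_KEY"]  # assumes .env defines OPENAI_API_KEY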
Shiny chatbot UI:
- For R: shinychat, https://posit-dev.github.io/shinychat/
- For Python: ui.Chat, https://shiny.posit.co/py/docs/genai-inspiration.html (see the sketch below)
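For example, a rough sketch of a Shiny for Python chat UI built with ui.Chat, assuming a recent shiny release; the echo response stands in for a real LLM call, and the exact API is covered in the linked docs:

from shiny.express import ui

ui.page_opts(title="Chat demo")

chat = ui.Chat(id="chat")
chat.ui()  # render the chat widget

@chat.on_user_submit
async def respond(user_input: str):
    # Echo the message back; a real app would send user_input to an LLM here.
    await chat.append_message(f"You said: {user_input}")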
Deployment:
- Enterprise solution: https://posit.co/products/enterprise/connect/
- Cloud hosting: https://connect.posit.cloud
- Open source: https://posit.co/products/open-source/shiny-server/
Querychat:

If you have specific follow-up questions about our professional products, you can schedule time to chat with our team: pos.it/llm-demo