Open-source tools: an AI vision
Imagine: what if frontend and backend development frameworks and tools shipped AI-fied by default, so that you could build a product with their tech (whatever it is) using natural language? And it worked?
That would make life much easier, right? Picture a set of tools shipped with a new definition of done, one that treats AI and natural-language interaction as a first-class citizen and lets you achieve the same results not by manually typing the code, but by feeding in prompts / tickets in natural language.
So you would have multiple tools (frameworks and libraries, each with an AI at its atomic level) and an overarching AI designed to assemble a product out of those numerous smaller AIs (each shipped with its corresponding AI 'unit-test' coverage reports, evaluation reports, etc.).
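To make that concrete, here is a minimal TypeScript sketch of what such a composition could look like. Everything in it (`ToolAgent`, `Orchestrator`, the keyword routing) is a made-up illustration, not any existing API:

```ts
// Hypothetical shape of an "AI-fied" tool: the tool ships its own agent
// plus machine-readable evidence of how well that agent performs.
interface EvalReport {
  scenario: string; // e.g. "render a data table with sorting"
  passRate: number; // fraction of eval runs that produced working code
}

interface ToolAgent {
  name: string;                              // e.g. "mui", "express"
  evals: EvalReport[];                       // the "AI unit-test" coverage
  generate(ticket: string): Promise<string>; // natural language -> code
}

// The overarching AI: routes each ticket to the tool agent it concerns
// and collects the pieces that make up the product.
class Orchestrator {
  constructor(private agents: ToolAgent[]) {}

  async build(tickets: string[]): Promise<string[]> {
    const artifacts: string[] = [];
    for (const ticket of tickets) {
      // Naive keyword routing; a real orchestrator would use the LLM itself.
      const agent =
        this.agents.find((a) => ticket.toLowerCase().includes(a.name)) ??
        this.agents[0];
      artifacts.push(await agent.generate(ticket));
    }
    return artifacts;
  }
}
```

The point of the `evals` field is that the orchestrator can weigh agents by their published evaluation reports instead of trusting them blindly.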
Think of it like this: each tool ships documentation the way a component library like MUI does, but besides the manual-usage description (the previous generation), every component page also has an AI section showcasing how to use the component through a natural-language interface, with dynamic, live examples (hi, containers). Like in bolt.new.
-
What open-source tooling could be developed and shared with the community to support this?
-
An AI docs portal (think of the MUI website, plus sections with AI examples showcasing consumption of each component through an AI).
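As a sketch of what the data behind such an AI section could look like, here is a hypothetical schema for one component page; every field name here is invented for illustration:

```ts
// Hypothetical schema for the AI section of a single component's docs page.
interface AiDocExample {
  component: string;       // e.g. "Button" on an MUI-style portal
  prompt: string;          // the natural-language request a user might type
  generatedCode: string;   // code the bundled model is expected to produce
  livePreviewUrl?: string; // containerized dynamic example, bolt.new-style
}

const buttonAiSection: AiDocExample[] = [
  {
    component: "Button",
    prompt: "Give me a primary button that shows a spinner while saving",
    generatedCode:
      '<Button variant="contained" disabled={saving}>' +
      '{saving ? <CircularProgress size={20} /> : "Save"}</Button>',
  },
];
```

Prompt/code pairs like these double as the tool's eval dataset: the same entries that render on the docs page can be replayed against the model to measure regressions.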
-
Such a docs portal would have to be containerized, bundling Ollama, open-webui, and a model pre-trained on this particular tool's components, use cases, and APIs.
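The portal's AI sections could then talk to the bundled Ollama instance over its standard REST API. A minimal sketch (the model name is a placeholder for whatever tuned model the tool ships):

```ts
// Minimal client for the Ollama instance bundled in the docs container.
// POST /api/generate is Ollama's standard non-streaming completion endpoint.
async function askToolModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mui-docs-model", // placeholder: the tool's tuned model
      prompt,
      stream: false,           // return one JSON object instead of a stream
    }),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Usage inside a component page's AI section:
// const code = await askToolModel("Show a Button with a loading spinner");
```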
-
Evaluation tasks in the style of MMLU, to score the bundled model against usage scenarios covering all components / APIs (with a human-readable HTML report of the results).
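A minimal sketch of such an eval harness, written so it can take the `askToolModel` client above as input. The scoring here is deliberately naive (substring checks); a real harness would compile and run the generated code:

```ts
import { writeFileSync } from "node:fs";

// One eval case per usage scenario, MMLU-style: a prompt plus checks
// that the generated code must satisfy to count as a pass.
interface EvalCase {
  scenario: string;
  prompt: string;
  mustContain: string[]; // naive check; a real harness would run the code
}

async function runEvals(
  ask: (prompt: string) => Promise<string>, // e.g. askToolModel from above
  cases: EvalCase[],
): Promise<void> {
  const rows: string[] = [];
  for (const c of cases) {
    const output = await ask(c.prompt);
    const passed = c.mustContain.every((s) => output.includes(s));
    rows.push(
      `<tr><td>${c.scenario}</td><td>${passed ? "PASS" : "FAIL"}</td></tr>`,
    );
  }
  // Human-readable HTML report, one row per scenario.
  writeFileSync(
    "eval-report.html",
    `<table><tr><th>Scenario</th><th>Result</th></tr>${rows.join("")}</table>`,
  );
}
```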
This is a very logical progression, and it will happen anyway. In part it has already begun: open-webui suggests downloading a container image with or without Ollama bundled, depending on whether you already have it installed locally.
In fact, the same logic applies to every layer in the stack. Tech stack. Enterprise runtime stack. Applications. Libraries. Frameworks.