Building an AI agent inside a 7-year-old Rails monolith
catalinionescu.dev - 36 points - 5 comments - 2 hours ago
Comments (5)
- pell - 56 minutes ago
Was there any concern about giving the LLM access to this return data? Reading your article, I wondered if there could be an approach that limits the LLM to running the function calls without ever seeing the output itself in full, e.g. only seeing the start of a JSON string with a status like “success” or “not found”. But I guess it would be complicated to have a continuous conversation that way.
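A rough sketch of the pattern pell describes, in plain Ruby: the tool runs the real lookup, keeps the full payload server-side, and hands the model back only a status plus an opaque reference. The class, the API client, and the result store are all hypothetical names, not anything from the post.

```ruby
# Sketch: run the real lookup, keep the full payload on the server, and
# return only a status plus an opaque reference to the model.
# ContactLookupTool, the api client, and the store are hypothetical names.
require "securerandom"
require "json"

class ContactLookupTool
  def initialize(api:, store:)
    @api   = api    # thin client around the contacts endpoint
    @store = store  # in-memory hash, Redis, a DB table, etc.
  end

  # Invoked by the agent loop when the model requests this tool.
  # The model never sees the contact record itself.
  def call(name:)
    record = @api.find_by_name(name)
    return { status: "not_found" }.to_json if record.nil?

    ref = SecureRandom.uuid
    @store[ref] = record                 # full data stays server-side
    { status: "success", ref: ref }.to_json
  end
end
```

When the application later needs the actual record, it dereferences `ref` itself rather than asking the model to repeat the data, which is roughly how the follow-up-conversation problem pell mentions would have to be handled.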
- sidd22 - 46 minutes ago
Hey, interesting read. I am working on a product in the Agent <> Tool layer. Would you be open to a quick chat?
- tovej - 15 minutes ago
If all this does is give you the data from a contact API, why not just let the users interact with the API directly? The LLM is just extra bloat in this case.
Surely a fuzzy search by name or some other field is a much better UI for this.
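tovej's alternative is easy to sketch as well. A minimal version, assuming a Contact model with a name column and a Postgres database behind it (neither is confirmed by the post):

```ruby
# Minimal sketch of a fuzzy contact search with no LLM in the loop.
# Assumes a Contact model with a `name` column and Postgres (ILIKE is
# Postgres-specific); escaping of % and _ wildcards is left out for brevity.
class ContactsController < ApplicationController
  def index
    q = params[:q].to_s.strip
    @contacts =
      if q.empty?
        Contact.none
      else
        Contact.where("name ILIKE ?", "%#{q}%").order(:name).limit(20)
      end
    render json: @contacts
  end
end
```

With a trigram index (pg_trgm) on `name` this stays fast on large tables; the broader point is that a plain search box needs no model in the request path.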
- MangoToupe - 53 minutes ago
Bruh, this cannot seriously be considered interesting by Hacker News guidelines. Where's the beef? Can I submit my Instagram client for points next?