And a few seconds later you get this. Looks like the person using it has at least some idea of what they are doing, so I hope for the best. They might actually end up using this work at some point.
What worked reasonably well for me is “deep research”.
Even when summarizing the content of the sources, it fucks up and says something factually incorrect that the source never stated. But it is pretty good at understanding what you are looking for and giving a list of relevant links to read. Somehow Google is worse at that and returns many irrelevant results.
Today, for example, I tried using Google to find a lithium battery of a specific size and capacity, and the results were not what I was looking for. Since I didn't know the technical terms, I just explained it in colloquial English ("regular smartphone battery, but with a footprint of 5x5 cm"), and it converted that into good search terms ("single cell 3.7V LiPo 105050/125050/135050"), which I then also used in my own search. It also provided a bunch of links to webshops that had those exact batteries!
I would never have known that I needed to search for the conventional numeric code that describes its size, or that there is a difference in the charging controller between single-cell and multi-cell batteries. (It gave me a source for that claim, and I used it to read up on the matter.)
Just googling my query only returned random batteries that weren't 5x5 cm, some multi-cell, some camcorder batteries, etc., without any link to the difference or explanation of the technical terms or standards.
I hoped LLMs would be able to help me learn things at first, like a patient tutor I could ask all my stupid questions without it ever getting annoyed with me.
It can do that for school-level stuff because that material is present in its training data in a redundant manner. For anything niche or domain-specific, it will hallucinate or fail.
I believe that when the bubble bursts, education will be one genuine use case for LLMs.
This is something I have been saying for a year now, albeit in a different form: "I only ask questions to LLMs if I already know the answer."
They are not supposed to replace coders, but rather to boost their productivity.
This use case is also quite good.
When the bubble bursts, what will survive is whatever makes money. Education doesn't make money unless it's in the national interest.
I’m pretty sure bugs were accepted into the Linux kernel well before the existence of LLMs.
But using a bug generator is a new low.