this post was submitted on 15 Jun 2024
2 points (100.0% liked)

Privacy


A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

[–] WalnutLum@lemmy.ml 1 points 4 months ago

There are VERY FEW fully open LLMs. Most are the equivalent of source-available licensing, and at best they're only partially open source, because all they give you is the pretrained model.

To be fully open source, they need to publish both the model and the training data. What matters is being "fully reproducible", which is what makes the model trustworthy.

In that vein there's at least one project that's turning out great so far:

https://www.llm360.ai/
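
For anyone who wants to actually try one of these fully open models, a minimal sketch using the Hugging Face transformers library might look like the one below. The "LLM360/Amber" repository id is an assumption based on LLM360's published releases, so check their site for the current checkpoint names.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library and that
# LLM360's checkpoints are published under "LLM360/Amber" (an assumption;
# see llm360.ai for their current releases).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "LLM360/Amber"  # assumed repo id for one of LLM360's open models

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "What does 'fully reproducible' mean for an open LLM?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```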

[–] umami_wasbi@lemmy.ml 1 points 4 months ago* (last edited 4 months ago) (1 children)

What's FOSS-AI? A model everyone can download and use for free? Or, in the OSS spirit, that everything needs to be open and without discrimination of use, i.e. an open-source training corpus and no AUP attached?

Or you mean the inference engine running those models?

[–] Peter_Arbeitslos@discuss.tchncs.de 0 points 4 months ago (1 children)

Anything that isn't BigTech. Preferably FOSS, but at the very least not BigTech; just alternatives to, for example, OpenAI.

[–] umami_wasbi@lemmy.ml 1 points 4 months ago* (last edited 4 months ago)

So you're including free models in the freeware sense, not FOSS only, as long as they're not from big tech.

Your choice of models will be quite limited, as the compute resources and training corpus needed to make a viable base model aren't something just anyone can pull together.

[–] thayer@lemmy.ca 0 points 4 months ago* (last edited 4 months ago) (3 children)

If a layman may ask, what are folks mostly using AI/LLMs for? Aside from playing around with a few for 10-15 minutes out of simple curiosity, I don't have a practical use for platforms like ChatGPT. I'm just wondering what the average tech enthusiast uses these for, outside of academia.

[–] snek_boi@lemmy.ml 1 points 4 months ago (1 children)

A friend of mine and I have gotten used to using it during our conversations. We do quick fact-checking or get a first opinion on silly topics, and we often find it faster than digging through search-engine results and piecing together scattered information. We've used it for thought experiments, intuitive or ELI5 explanations of topics we don't really know, finding peer-reviewed sources for whatever we're interested in, or asking questions that would be harder to turn into effective search-engine queries than to ask in natural language. We always ask for citations and links, so that we can discard hallucinations.
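
That "always ask for citations" habit is easy to bake into a prompt. A minimal sketch, assuming the openai Python client (v1+) and any OpenAI-compatible chat endpoint; the model id is a placeholder:

```python
# A minimal sketch: ask for citations up front so hallucinated claims without
# sources are easy to discard. Assumes the `openai` client (v1+); the model
# id is a placeholder, and any OpenAI-compatible endpoint would do.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

question = "Does intermittent fasting measurably affect resting heart rate?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model id
    messages=[
        {"role": "system",
         "content": "Answer briefly. Always include citations with links to "
                    "peer-reviewed sources; say you are unsure if you cannot."},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```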

[–] thayer@lemmy.ca 1 points 4 months ago* (last edited 4 months ago)

Thanks for sharing! I'm probably too set in my ways to ever use AI for things like this. I never use virtual assistants like Alexa or Google either, as I like to vet and interpret the sources of information myself. Having the citations would be handy, but ultimately I'd want to read them myself, so the AI/VA just becomes an added step.

[–] Eggyhead@kbin.run 1 points 4 months ago (1 children)

I teach language. I get paid for my time in front of students, not for the time it takes to prepare their lessons and materials. I use AI to quickly reference grammar rules, to generate example dialogs for specific scenarios we can practice, and to suggest in-class activities that target the grammar we're working on. I never use exactly what it says; I just treat it as a source of suggestions to build from.

[–] thayer@lemmy.ca 1 points 4 months ago* (last edited 4 months ago) (1 children)

That sounds like a time saver for sure. I imagine that some of those elements (grammar rules) are widely available everywhere, while others (practice dialogues, activity suggestions focused on the use of language) would require a fairly specific training model.

[–] Eggyhead@kbin.run 1 points 4 months ago

Well, LLMs are quite literally trained on language, so asking one to simulate a conversation between a hotel clerk and a guest who is upset that they can't find the hair dryer is pretty much what it's best at.

You can even build the dialogs with the students. Have them introduce a scenario for the LLM to work from, then have them suggest variables to apply, such as the clerk being hungry and in a bad mood while the guest is actually drunk after returning from a club, to see how the language changes. Then have the students act it out for laughs.
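
A rough sketch of how that scenario-plus-variables idea could be turned into a reusable prompt builder. The function and parameter names are just illustrative, and the actual model call is left to whatever chat endpoint or local model you already use:

```python
# A minimal sketch: build a classroom dialog prompt from a scenario plus
# student-suggested "variables". Names here are illustrative, not any
# particular tool's API; send the resulting prompt to your LLM of choice.
def build_dialog_prompt(scenario: str, variables: list[str], target_grammar: str) -> str:
    constraints = "\n".join(f"- {v}" for v in variables)
    return (
        "Write a short dialog for language learners.\n"
        f"Scenario: {scenario}\n"
        f"Constraints suggested by the class:\n{constraints}\n"
        f"Use the target grammar naturally: {target_grammar}\n"
        "Keep it to 8-10 turns and label the speakers."
    )

prompt = build_dialog_prompt(
    scenario="A hotel guest complains that they can't find the hair dryer.",
    variables=["the clerk is hungry and in a bad mood",
               "the guest has just come back from a club"],
    target_grammar="polite requests with 'could' and 'would'",
)
print(prompt)  # paste or send this to the LLM you already use
```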

[–] ErwinLottemann@feddit.de 0 points 4 months ago

we use it to classify data that needs to be sent to one of three endpoints. chatgpt tells our tool where each item belongs. there are probably more practical ways to do this, but the customer wanted AI in his product, so here we are 🤷
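
For the curious, that "LLM as a router" pattern might look roughly like this. A minimal sketch assuming an OpenAI-compatible client; the endpoint labels, model id, and fallback are all placeholders, not the commenter's actual system:

```python
# A minimal sketch of routing records to one of three endpoints via an LLM
# classification prompt. Endpoint labels and the model id are placeholders.
from openai import OpenAI

ENDPOINTS = ["billing", "support", "sales"]  # placeholder endpoint labels

client = OpenAI()

def route(record: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model id
        messages=[
            {"role": "system",
             "content": "Classify the record into exactly one of: "
                        f"{', '.join(ENDPOINTS)}. Reply with the label only."},
            {"role": "user", "content": record},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in ENDPOINTS else ENDPOINTS[0]  # crude fallback

print(route("Customer asks why they were charged twice last month."))
```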