There's someone I sometimes encounter in a Discord I'm in who makes a hobby of doing stuff with them. From what I gather, they do more with it than just asking for a prompt and leaving it at that, at least partly because it doesn't generally give them something they're happy with initially, and they end up having to ask the thing to edit specific bits of it in different ways over and over until it does. I don't really understand what exactly this entails, as what they seem to most like making it do is code "shaders" that create unrecognizable abstract patterns, but they spend a lot of time talking at length about the technical parameters of various models and what they like and don't like about them, so I assume the guy must find something enjoyable in it all. That being said, using it as a sort of strange toy isn't really the most useful use case.
Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
I think it’s a fun toy that is being misused and forced into a lot of things it isn’t ready for.
I'm doing a lot with AI, but it's pretty much slop. I use self-hosted Stable Diffusion, Ollama, and Whisper for a Discord bot, code help, and writing assistance, and I pay ElevenLabs for TTS so I can talk to it. It's been pretty useful. It's all running on an old computer with a 3060. Voice chat is a little slow and has its own problems, but it's all been fun to learn.
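For anyone curious what the Ollama part of a setup like this involves: the local server exposes a plain HTTP API, so the bot side is just a JSON POST. This is a minimal sketch, not the commenter's actual code; the endpoint and payload shape are Ollama's documented defaults, and the model name is illustrative.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model, prompt):
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model, prompt):
    """Send a prompt to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, the full reply arrives in one JSON object.
        return json.loads(resp.read())["response"]

# Building the payload works offline; ask() needs a running Ollama instance.
print(build_request("llama3", "Summarize this changelog."))
```

A Discord bot would just call `ask()` from its message handler; everything stays on the local machine.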
I used it the other day to spit out a ~150 line python script. It worked flawlessly on the first try.
I don’t know python.
I've enjoyed some of the absurd things it can come up with. Surreal videos and memes (every president as a bodybuilder wrestler). However, it's never been useful, and the cost isn't worth the benefit, to me.
It's done a lot of bad/annoying things but I'd be lying if I said it hasn't enabled me to completely sidestep the enshittification of Google. You have to be smart about how you use it but at least you don't have to wade through all the SEO slop to find what you want.
And it's good for weird/niche questions. I used it the other day to find a list of meme songs that have very few/simple instruments so that I could find midi files for them that would translate well when going through Rust's in-game instruments. I seriously doubt I'd find a list like that on Google, even without the enshittification.
It's useful for programming from time to time, but not for asking open questions. I've found having to double-check everything too unnerving; having it just provide the links instantly is more my way of working. Other than that, it sometimes sketches things out when I have no idea what to do, so all in all it's a glorified search engine for me.
At work, I despise writing emails and reports, and it fluffs them up. I usually have to edit them afterwards so they don't look AI-made, but it adds some „substance".
It's fine if used in the specific niche use cases it's trained for, as long as it's used as a tool and not as the final product. For example, using AI to generate background elements of a complete image. The AI elements aren't the focus and should be things that don't matter much, but it might be better to use an AI element than to do a bare-minimum element by hand. This might be something like a blurred-out environment background behind a piece of hand-drawn character art: otherwise it might just be a gradient or solid colour because it isn't important, but having something low-quality is better than having effectively nothing.
In a similar case, for multidisciplinary projects where the artists can't realistically work proficiently in every field required, AI assets may be good enough to meet the minimum requirements to at least complete the project. For example, I do a lot of game modding. I'm proficient with programming, game/level design, and 3D modeling, but not good enough to make dozens of textures and sounds that are up to snuff. I might be able to dedicate time to making a couple of the most key resources myself, or hire someone, but seeing as this is a non-commercial, non-monetized project, I can't buy resources regularly. AI can be a good-enough solution to get the project out the door.
In the same way, LLM tools can be good when used to "extend" existing work. It's generally a bad idea to rely entirely on them, but if you use one to polish a sentence you wrote, come up with phrasing ideas, or write your long if-chain for you, then it's a way of improving or speeding up your work.
Basically, AI tools as they are should be seen by those in, or adjacent to, the related profession as another tool in the toolbox, rather than as a way to replace the human.
To me it's glorified autocomplete. I see LLMs as a potential way of drastically lowering the barrier of entry to coding, but I'm at a skill level where coercing a chatbot into writing code is a hindrance. What I need is good documentation and good IDE static analysis.
I'm still waiting on a good, IDE-integrated, local model that's capable of more than autocompleting a line of code. I want it to generate the boilerplate parts of the code and then get out of my way while I solve problems.
What I don't want, is a fucking chatbot.
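For what it's worth, the "glorified autocomplete" framing isn't far from how next-token prediction works at its core. A toy bigram model (the corpus here is made up) captures the same idea at a tiny scale: pick the word that most often followed the current one in the training text.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it and how often."""
    follows = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def autocomplete(follows, word):
    """Predict the most frequent next word, like bare-bones autocomplete."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

corpus = "the cat sat on the mat the cat ran"
model = train_bigrams(corpus)
print(autocomplete(model, "the"))  # 'cat' follows 'the' twice, 'mat' once
```

Real LLMs replace the lookup table with a learned function over long contexts, but "predict the likeliest continuation" is still the basic move.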
To me AI is useless. It's not intelligent; it's just a blender that blends up tons of results into one hot steaming mug of "knowledge". If you toss a nugget of shit into a smoothie while it's being blended, it's gonna taste like shit. Considering the amount of misinformation on the internet, everything AI spits out is shit.
It is purely derivative, devoid of any true originality, with a vague facade of intelligence in an attempt to bypass existing copyright law.
It’s really helped me get recipes without website ads overtaxing my old surface.
Boilerplate code (the stuff you usually have to copy from GitHub anyway) and summarising long, boring articles: that's the use case for me. Other than that I agree. Having done AI service-agent coding myself for fun, I can seriously say that I would not trust it to run a business service without a human in the loop.
I have horrible spelling and sometimes write in an archaic register. I also often write in a way that sounds rather aggressive, which is not my intention most of the time. AI helps me rewrite that shit and makes me more sensitive to tone in written text.
Of course, just like with normal spell check and autocomplete features, one still needs to read it a final time.
So I'm really bad about remembering to add comments to my code, but since I started using GitHub's AI code assistant thing in VS Code, it makes contextual suggestions when you start a comment line. I've even gone back to stuff I made ages ago and used it to figure out what the hell I was thinking when I wrote it back then 😆
It's actually really helpful.
I feel like once the tech adoption curve settles down, it will be most useful in cases like that: contextual analysis
I work on a 20+ year knowledge base for a big company that has had no real content management governance for pretty much that whole time.
We knew there was duplicate content in that database, but we're talking about thousands of articles, with several more added daily.
With such a small team, identifying duplicate/redundant content was just an ad-hoc thing that could never be tackled as a whole without a huge amount of resources.
AI was able to comb through everything and find hundreds of articles with duplicate/redundant content within a few hours. Now we have a list of articles we can work through and clean up.
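The comment doesn't say what tooling was used, but the core of that kind of deduplication pass — scoring article pairs by text similarity and flagging the near-identical ones — can be sketched with plain bag-of-words cosine similarity. The article texts and the 0.9 threshold below are invented for illustration.

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Turn text into a bag-of-words frequency vector."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two word-frequency vectors (0..1)."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def find_duplicates(articles, threshold=0.9):
    """Return pairs of article ids whose texts look near-identical."""
    vecs = {aid: vectorize(text) for aid, text in articles.items()}
    ids = sorted(vecs)
    return [
        (ids[i], ids[j])
        for i in range(len(ids))
        for j in range(i + 1, len(ids))
        if cosine_similarity(vecs[ids[i]], vecs[ids[j]]) >= threshold
    ]

articles = {
    "kb-101": "How to reset your password in the portal",
    "kb-205": "How to reset your password in the portal.",
    "kb-330": "Configuring VPN access for contractors",
}
print(find_duplicates(articles))  # [('kb-101', 'kb-205')]
```

An LLM-based pass would use embeddings instead of raw word counts, which also catches rewordings, but the output is the same: a ranked list of candidate duplicates for humans to review.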
I use LLMs for multiple things, and they're useful for things that are easy to validate. E.g. when you're trying to find or learn about something but don't know the right terminology or keywords to put into a search engine. I also use them for some coding tasks. They work OK for getting customized usage examples for libraries, languages, and frameworks you may not be familiar with (but will sometimes use old APIs or just hallucinate APIs that don't exist). They work OK for "translation" tasks, such as converting a MySQL query to a Postgres query. I tried out GitHub Copilot for a while, but found that it would sometimes introduce subtle bugs that I would initially overlook, so I don't use it anymore. I've had to create some graphics, and am not at all an artist, but was able to use AUTOMATIC1111, ControlNet, Stable Diffusion, and GIMP to get usable results (an artist would obviously be much better though). RemBG works pretty well for isolating the subject of an image and removing the background too. Image upsampling, DLSS, DTS Neural:X, plant identification apps, the blind-spot warnings in my car, image stabilization, and stuff like that are pretty useful too.
It's great for parsing through the enshittified journalism. You know the classic recipe blog trope? If you ask chatgpt for a recipe, it just gives you one. Whether it's good or not is a different story, but chatgpt is leagues better at getting to the info you want than search has been for the last decade.
I use ChatGPT and Copilot as search engines, particularly for programming concepts or technical documentation. The way I figure, since these AI companies are scraping the internet to train these models, it’s incredibly likely that they’ve picked up some bit of information that Google and DDG won’t surface because SEO.
It's great for documentation like APIs, and it really makes a difference.
I usually keep abreast of the scene so I'll give a lot of stuff a try. Entertainment wise, making music and images or playing dnd with it is fun but the novelty tends to wear off. Image gen can be useful for personal projects.
Work wise, I mostly use it to do deep dives into things like datasheets and libraries, or doing the boring coding bits. I verify the info and use it in conjunction with regular research but it makes things a lot easier.
Oh, also tts is fun. The actor who played Dumbledore reads me the news and Emma Watson tells me what exercise is next during my workout, although some might frown on using their voices without consent.
That's a bit of a loaded question. By AI I assume you're referring to GenAI/LLMs rather than AI broadly.
- I use it to correct my spelling on longer posts and I find that it improves the clarity and helps my point come across better.
- I use Dall-E to create pictures I never could have before, because despite my interest in drawing, I just never bothered to learn it myself. GenAI enables me to skip the learning and go straight to creating.
- I like that it can simulate famous people and allows me to ask 'them' questions that I never could in real life. For example, yesterday I spent a good while chatting with 'Sam Harris' about the morality of lying and the edge cases where it might be justified. I find discussions like this genuinely enjoyable and insightful.
- I also like using the voice mode where I can just talk with it. As a non-native English speaker, I find it to be good practice to help me improve my ~~spelling~~ pronunciation.
I went for a routine dental cleaning today and my dentist integrated a specialized AI tool to help identify cavities and estimate the progress of decay. Comparing my x-rays between the raw image and the overlay from the AI, we saw a total of 5 cavities. Without the AI, my dentist would have wanted to fill all of them. With the AI, it was narrowed down to 2 that need attention, and the others are early enough that they can be maintained.
I'm all for these types of specialized AIs, and hope to see even further advances in the future.
I use ChatGPT to ask programming questions; it's not always correct, but neither is Stack Overflow nowadays. At least it will point me in the right direction.
ChatGPT actually explains the code and can answer questions about it and doesn't make snarky comments about how your question is a duplicate of sixteen other posts which kind of intersect to do what you want but not in a clean way.
to copy my own comment from another similar thread:
I’m an idiot with no marketable skills. I put boxes on shelves for a living. I want to be an artist, a musician, a programmer, an author. I am so bad at all of these, and between having a full time job, a significant other, and several neglected hobbies, I don’t have time to learn to get better at something I suck at. So I cheat. If I want art done, I could commission a real artist, or for the cost of one image I could pay for dalle and have as many images as I want (sure, none of them will be quite what I want but they’ll all be at least good). I could hire a programmer, or I could have chatgpt whip up a script for me since I’m already paying for it anyway since I want access to dalle for my art stuff. Since I have chatgpt anyway, I might as well use it to help flesh out my lore for the book I’ll never write. I haven’t found a good solution for music.
I have in my brain a vision for a thing that is so fucking cool (to me), and nobody else can see it. I need to get it out of my brain, and the only way to do that is to actualize it into reality. I don’t have the skills necessary to do it myself, and I don’t have the money to convince anyone else to help me do it. generative AI is the only way I’m going to be able to make this work. Sure, I wish that the creators of the content that were stolen from to train the ai’s were fairly compensated. I’d be ok with my chatgpt subscription cost going up a few dollars if that meant real living artists got paid, I’m poor but I’m not broke.
These are the opinions of an idiot with no marketable skills.
It's really good for all kinds of scams.
So...
Low Tier / Wannabe Corpos?
The only things I use and I know they have AI are Spotify recommendations, live captions on videos and DLSS. I don't find generative AI to be interesting, but there's nothing wrong with machine learning itself imo if it's used for things that have purpose.
I hate that it monetized general knowledge that used to be easily searchable, then repackaged it as some sort of black-box randomizer.
It's funny to fuck around with, in the same way it's funny to ask a Bible bot for Judges 15-16 and watch the bot get autobanned for saying "ass".
That's about all it is, though: a stupid silly thing to fuck around with.
It shouldn't be a production/human-replacement thing.
I've been finding it useful for altering recipes to take my wife's allergies into account. I don't use it for much else. And certainly not for anything important.
For the most part it's not useful, at least not the way people use it most of the time.
It's an engine for producing text that's most like the text it's seen before, or for telling you what text it's seen before is most like the text you just gave it.
When it comes to having a conversation, it can passably engage in small talk, or present itself as having just skimmed the Wikipedia article on some topic.
This is kinda nifty and I've actually recently found it useful for giving me literally any insignificant mental stimulation to keep me awake while feeding a baby in the middle of the night.
Using it to replace thinking or interaction gives you a substandard result.
Using it as a language interface to something else can give better results.
I've seen it used as an interface to a set of data collection interfaces, where all it needed to know how to do was tell the user what things they could ask about, and then convert their responses into inputs for the API, and show them the resulting chart. Since it wasn't doing anything to actually interpret the data, it never came across as "wrong".
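The pattern described there usually comes down to one safeguard: the model's structured output is validated against a whitelist before it ever reaches the API, so the model translates language but never interprets data. A rough sketch of that validation step, where the field names and allowed values are invented for illustration:

```python
import json

# Hypothetical whitelist of inputs the data-collection API accepts.
ALLOWED = {
    "metric": {"sales", "signups", "churn"},
    "period": {"daily", "weekly", "monthly"},
}

def parse_model_output(raw):
    """Validate the LLM's JSON reply against the whitelist.

    The model only converts the user's words into these fields; anything
    outside the whitelist is rejected rather than trusted, so the system
    can't come across as "wrong" about the data itself.
    """
    data = json.loads(raw)
    for field, allowed in ALLOWED.items():
        if data.get(field) not in allowed:
            raise ValueError(f"model produced invalid {field}: {data.get(field)!r}")
    return {field: data[field] for field in ALLOWED}

# e.g. the model turned "how did signups look last week?" into:
print(parse_model_output('{"metric": "signups", "period": "weekly"}'))
```

The validated dict is what gets sent to the charting API; the LLM's free-form text never touches it.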