this post was submitted on 26 Aug 2024

No Stupid Questions


No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:

Rules


Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical, or professional help here.

Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting or sealioning or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On Fridays, you are allowed to post meme and troll questions, on the condition that they are in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on Friday and are looking only for legitimate answers, then please include the [Serious] tag in your post title. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you are provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- The majority of bots aren't allowed to participate here.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 1 year ago

By "good" I mean code that is written professionally and concisely (and obviously works as intended). Apart from personal interest and understanding what the machine spits out, is there any legit reason anyone should learn advanced coding techniques? Specifically from an engineering perspective?

If not, learning how to write code seems a tad trivial now.

cley_faye@lemmy.world 2 points 2 months ago

For repetitive tasks, it can take a first template you write by hand and extrapolate multiple variations almost automatically.
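As a concrete sketch of that "template plus variations" workflow (in Python; the record shape and field names are made up for illustration): a human writes the first accessor by hand, and completion produces the near-mechanical variations.

```python
# Hand-written template: one accessor, written once.
def get_name(record: dict) -> str:
    """Return the 'name' field, or an empty string if missing."""
    return record.get("name", "")

# The variations a completion model reliably extrapolates from it:
def get_email(record: dict) -> str:
    """Return the 'email' field, or an empty string if missing."""
    return record.get("email", "")

def get_phone(record: dict) -> str:
    """Return the 'phone' field, or an empty string if missing."""
    return record.get("phone", "")
```

Each variation is trivially checkable by eye, which is exactly why this is the case where completion pays off.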

Beyond that… not really. Anything beyond single-line completion quickly devolves into something messy, non-working, or worse, working but not as intended. For extremely common cases it will work fine; but extremely common cases are either moved out into shared code, or take less time to write than to "generate" and check.

I've been using code completion/suggestion regularly, and there have been times when I was pleasantly surprised by what it produced, but even then I had to look it over and fix some things. And while I can't quantify how often it happened, there were plenty of times when it produced convincing gibberish.

anytimesoon@feddit.uk 1 point 2 months ago

I've also had some decent luck when using a new/unfamiliar language by asking it to make the code I wrote more idiomatic.

It's been a nice way to learn some tricks I probably wouldn't have bothered with before
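For illustration, here's the kind of "make it more idiomatic" rewrite being described, sketched in Python (the function and its purpose are hypothetical):

```python
# Literal, language-agnostic style a newcomer might write:
def even_squares_verbose(numbers):
    result = []
    for i in range(len(numbers)):
        if numbers[i] % 2 == 0:
            result.append(numbers[i] ** 2)
    return result

# The idiomatic rewrite an LLM typically suggests: a list
# comprehension that says the same thing in one line.
def even_squares_idiomatic(numbers):
    return [n ** 2 for n in numbers if n % 2 == 0]
```

Both behave identically; the second just reads like the language it's written in, which is the trick worth learning.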

Neon@lemmy.world 2 points 2 months ago

The LLM can type the code, but you need to know what you want and how you want to solve it.

Anticorp@lemmy.world 2 points 2 months ago

Absolutely, but they need a lot of guidance. GitHub CoPilot often writes cleaner code than I do. I'll write the code and then ask it to clean it up for me and DRYify it.
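As a rough sketch of what such a "DRYify" pass looks like (all names and the retry policy are hypothetical, not CoPilot's actual output):

```python
# Before: the same retry boilerplate pasted around every request.
def fetch_users(client):
    for _ in range(3):
        try:
            return client.get("/users")
        except ConnectionError:
            continue
    raise RuntimeError("gave up on /users")

# After: the repetition factored into one reusable helper.
def fetch_with_retry(client, path, attempts=3):
    """Call client.get(path), retrying on ConnectionError."""
    for _ in range(attempts):
        try:
            return client.get(path)
        except ConnectionError:
            continue
    raise RuntimeError(f"gave up on {path}")

def fetch_users_dry(client):
    return fetch_with_retry(client, "/users")
```

The refactor is mechanical, which is why an LLM handles it well, but deciding that the duplication is worth removing is still the human's call.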

TranquilTurbulence@lemmy.zip 2 points 2 months ago

Yes and no. GPT usually gives me clever solutions I wouldn’t have thought of. Very often GPT also screws up, and I need to fine-tune variable names, function parameters and such.

I think the best thing about GPT is that it knows the documentation of every function, so I can ask technical questions. For example, can this function really handle dataframes, or will it internally convert the variable into a matrix and then spit out a dataframe as if nothing happened? Such conversions tend to screw up the data, which explains some strange errors I bump into. You could read all of the documentation to find out, or you could just ask GPT about it. Alternatively, you could show how badly the data got screwed up after a particular function, and GPT would tell you that it’s because this function uses matrices internally, even though it looks like it works with dataframes.
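The conversion gotcha being described can be reproduced in pandas/NumPy (a hypothetical illustration of the general failure mode, not any specific library's function):

```python
import pandas as pd

# A mixed-type dataframe: one numeric column, one string column.
df = pd.DataFrame({"age": [30, 41], "name": ["Ada", "Bob"]})
# df.dtypes: age is int64, name is object

# A function that "works with dataframes" but uses a matrix
# internally must first coerce everything to one common dtype:
matrix = df.to_numpy()
# matrix.dtype is object -- the integers are no longer an int64 column

# Round-tripping back "as if nothing happened" loses the dtypes:
df2 = pd.DataFrame(matrix, columns=df.columns)
# df2.dtypes: both columns are now object
```

Downstream code that expects `age` to still be a numeric column then fails in the strange, hard-to-trace ways the comment mentions.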

I think of GPT as an assistant painter some famous artists had. The artist tells the assistant to paint the boring trees in the background and the rough shape of the main subject. Once that’s done, the artist can work on the fine details, sign the painting, send it to the local king and charge a thousand gold coins.

nikaaa@lemmy.world 1 point 2 months ago

My dad uses LLM Python code generation quite routinely; he says the output's mostly fine.

Angry_Autist@lemmy.world 1 point 2 months ago

For snippets, yes. Ask him to tell it to make a complete terminal service and see what happens.

Subverb@lemmy.world 3 points 2 months ago

I use LLMs for C code - most often when I know full well how to code something but I don't want to spend half a day expressing it and debugging it.

ChatGPT or Copilot will spit out a function or snippet that's usually pretty close to what I want. I patch it up and move on to the tougher problems LLMs can't do.

ImplyingImplications@lemmy.ca -1 points 2 months ago

Writing code is probably one of the few things LLMs actually excel at. Few people want to program something nobody has ever done before. Most people are just reimplementing the same things over and over with small modifications for their use case. If imports of generic code someone else wrote make up 90% of your project, what's the difference in getting an LLM to write 90% of your code?

chknbwl@lemmy.world 1 point 2 months ago

I see where you're coming from, sort of like the phrase "don't reinvent the wheel". However, considering ethics, that doesn't sound far off from plagiarism.

dandi8@fedia.io 1 point 2 months ago

IMO this perspective that we're all just "reimplementing basic CRUD" applications is the reason why so many software projects fail.
