this post was submitted on 28 Jun 2023
270 points (98.6% liked)

You Should Know


Why you should know: StackOverflow is facing a mod strike much like Reddit's. The moderators are striking in response to StackOverflow's failure to keep its promises and provide moderation tools.

[–] gonzo0815@sh.itjust.works 9 points 1 year ago* (last edited 1 year ago) (9 children)

Unpopular opinion: for a beginner, ChatGPT gives way better answers than StackOverflow users. The advantage of ChatGPT is that I can tell it to dumb things down. StackOverflow users are used to answering in a language that resembles the language of documentation. Their answers are dry, abstract, and lack good examples, to the point that the "foobar" shit triggers an immediate defensive reaction in my brain, and they are phrased for people who already understood a concept but just need to refresh their knowledge. Their core problem, as is tradition in any IT field, is that they lack the empathy to understand the viewpoint of someone who understands less of something than they do. It's like asking someone to teach you to read and getting a poem as an answer, with the advice to just read it.

I can get around that with ChatGPT by asking it to ELI5. Also, I get an answer instantly, I'm not discouraged from asking further questions, and I'm not just pointed to a link where the solution is offered in equally difficult language.

People are saying that using ChatGPT doesn't give accurate information and fails to convey important concepts, but I feel it's actually the other way around. Since ChatGPT came along, I've been making way more progress than before.

I understand that users don't want AI answers, but I also don't get why anyone would want that on this platform. You can just, you know, use AI directly.

[–] veroxii@lemmy.world 12 points 1 year ago (1 children)

That's not unpopular. But there is a problem. ChatGPT can answer your questions mostly because it was trained on the posts and answers of sites like StackOverflow.

If people abandon SO and similar forums, then the quality of ChatGPT's answers will go down too.

Especially with something like programming. It's always changing. Next year there will be new versions of C++ and Python. There will be new JS frameworks, as always. It doesn't stand still.

And without new discussions about new problems, there's nowhere for ChatGPT to learn about them.

[–] gonzo0815@sh.itjust.works 3 points 1 year ago

Haven't thought about that, you're right.

[–] InfiniteFlow@lemmy.world 2 points 1 year ago (1 children)

> People are saying that using ChatGPT doesn't give accurate information and fails to convey important concepts

I wish my students would care about the concepts and try to understand the answers instead of just blindly copying and pasting ill-fitting code (and then wondering why it only kinda works...).

As a former student, now a practicing engineer: this habit never gets broken. All of us accept cargo cult computing to one extent or another. It sucks.

Usually the engineers with the least tolerance for it do better, but only in the long run. In the short run they get yelled at for holding back projects.

[–] Crackhappy@lemmy.world 2 points 1 year ago (1 children)

I think that one issue with using AI to help you solve programming problems is that sometimes it will wholesale make things up. Of course, people can do that too, which is why communities of coders can vote on the best answer. I say, more power to you, using the tools that work for you. Just be cautious.

[–] pachrist@lemmy.world 1 points 1 year ago

The key with ChatGPT for me has been to use it as an augmentation, not a gap filler. There's some prerequisite knowledge required on my part. It's a much more useful tool when it's helping flesh out something I know but have forgotten, or am familiar with but not proficient in. That means I find mistakes faster and am less prone to having it loop or hallucinate. If I need to ask a question about something where I know very little or nothing at all, I'll peek at a Wikipedia page or something first if I can.

[–] JonnyJ@lemmy.world 1 points 1 year ago (1 children)

ChatGPT is incredible for middle ground developers like myself. I understand the goal I'm trying to achieve, and I understand the general process of how to do it. I can ask very granular, specific questions to ChatGPT and it will spit out some code that will get me close to what I need.

If I was a complete novice, I think ChatGPT would make me too dependent on using it for answers.

[–] Crackhappy@lemmy.world 3 points 1 year ago (1 children)

That seems like a totally valid use case. I occasionally will outline some very specific requirements and have AI generate the code, which just saves a lot of time typing, versus it generating it entirely on its own. And I still go through all the code and verify that it's good. It's just a tool that can be used to make your job easier.

[–] JonnyJ@lemmy.world 1 points 1 year ago (1 children)

Totally. The other day I had to test a CSV/XLS upload tool. I wanted to make sure that no matter what format an asshole user typed their phone number in, it would strip everything out so the value ended up as a valid integer for my database.

I told ChatGPT to make me a CSV with 20 rows, 6 columns with xyz headers, and to give me an assortment of different phone number formats. Took 10 seconds.
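
As a rough illustration of the kind of cleanup and test data being described (not the commenter's actual code; the column names and the `normalize_phone` helper are made up for the example):

```python
import csv
import random

# Hypothetical helper: strip every non-digit character so only digits remain
# before the value goes anywhere near the database.
def normalize_phone(raw: str) -> str:
    return "".join(ch for ch in raw if ch.isdigit())

# Build a small test CSV with an assortment of phone number formats,
# similar to what ChatGPT was asked to generate.
formats = [
    "(555) 123-4567",
    "555.123.4567",
    "+1 555 123 4567",
    "555 1234567",
    "555-123-4567",
]

with open("phones_test.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name", "phone"])
    for i in range(20):
        writer.writerow([i, f"user{i}", random.choice(formats)])

# Every variant collapses to plain digits:
assert normalize_phone("(555) 123-4567") == "5551234567"
```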

[–] david@feddit.uk 1 points 1 year ago

You're storing phone numbers as integers?

[–] Botree@lemmy.world 2 points 1 year ago

Here to echo the same. I thought using AI to assist me in coding would just make me lazy and leave me learning nothing, but it turns out I actually learn more than ever, since it's much faster, more polite and patient, and its explanations are usually better catered to my needs and more self-explanatory than the average answers I find elsewhere.

It's great for writing snippets and creating basic frameworks. However, it definitely makes a lot of mistakes which I doubt a total beginner can spot, especially if the error lies in logic and not syntax.
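
A tiny, made-up example of the kind of logic error that runs without complaint and is easy for a beginner to miss:

```python
def average(values):
    # Logic bug: divides by len(values) - 1 instead of len(values).
    # There is no syntax error and nothing crashes; the result is simply wrong.
    return sum(values) / (len(values) - 1)

print(average([2, 4, 6]))  # prints 6.0, but the correct average is 4.0
```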

For now it only works well as a tool, but chances are AI will surpass human coders sooner than we think.

[–] Stuka@lemmy.world 1 points 1 year ago (1 children)

I played around with ChatGPT for programming for a few hours a while back.

It is far better at explaining code in plain language than pretty much any human I've seen, at least online. It's absolute dogshit at writing anything but the most basic of code, but it does do a good job explaining.

Programmers are shit at communicating.

[–] sambeastie@lemmy.world 2 points 1 year ago

I've found that it gives me a decent skeleton of something that I can then apply to my actual problem, but not much more, and it usually comes with some pretty big mistakes. I was trying to learn Z80 assembly and it gave me a good idea of how my code should generally look, but I did end up having to rewrite a whole bunch of it before I could actually execute anything.

[–] sulungskwa@lemmy.world 1 points 1 year ago

I think there's a sweet spot depending on how many other resources are out there. GPT answers for JavaScript are pretty good, but when you get to a less popular language like Elixir, not so much.

[–] Machefi@lemmy.world 1 points 1 year ago

I'm using Bing AI, but that itself uses ChatGPT. The answers are well written, but I feel like it's important to keep in mind that language models, by design, lie often and do it in an extremely plausible way. Use AI all you want, but never rely on its answer without proper fact-checking.

[–] ScreaminOctopus@sh.itjust.works 1 points 1 year ago (1 children)

I've yet to get a useful answer out of chatgpt for a technical question. It's good for fluffing up emails, but I haven't been super impressed with any use case I've tried for it.

[–] CylonBunny@lemmy.world 4 points 1 year ago (1 children)

When I've used it for decently complex programming questions, I've found it often likes to make up functions and libraries. It'll tell you to just use this reasonable-sounding function from some library, and then I look it up and the library doesn't have that functionality at all. Over and over!
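
One cheap habit that catches this kind of hallucination is to check that the module actually exposes the name before building on it. A minimal sketch, where the module and attribute names are just placeholders:

```python
import importlib

def suggestion_exists(module_name: str, attr_name: str) -> bool:
    """Return True if the named module really has the suggested attribute."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr_name)

print(suggestion_exists("json", "dumps"))        # True: this function exists
print(suggestion_exists("json", "dump_pretty"))  # False: plausible-sounding, but made up
```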

[–] david@feddit.uk 0 points 1 year ago

Well, it's a large language model that generates text probabilistically. It's trained on vast amounts of data, so it's expert at sounding like a skilled programmer, but there's absolutely no reason at all for the results to be useful code. It will sound like useful code and look like useful code, and it will be on the right topic, and that might well be enough, but it might not be.