this post was submitted on 18 Jun 2023
13 points (100.0% liked)


just a reminder that much of reddit's collective zeitgeist up to sept 2021 is available via chat-gpt; you just need to ask it the right way.

i've gotten the best results by starting my question with something like "what are common ways mentioned on reddit to ..."

you'll need to create an account for chat-gpt - no credit card needed.

also, the new search function in bing uses chat-gpt on the backend but combines it with bing data for more up-to-date content.

https://chat.openai.com
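
for anyone who'd rather script this than use the web UI, here's a minimal sketch using the openai Python package as it existed around mid-2023. the model name and the example topic are assumptions for illustration only; the prompt phrasing is the "mentioned on reddit" trick from the post.

```python
# Rough sketch: asking the same kind of question through the OpenAI API
# instead of the chat.openai.com web UI mentioned above.
import openai

openai.api_key = "sk-..."  # your own API key (the web UI itself needs no key)

topic = "organize a small home office"  # hypothetical topic, not from the post

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed model; the post doesn't name one
    messages=[
        {
            "role": "user",
            # phrasing borrowed from the post: lead with "what are common ways
            # mentioned on reddit to ..." to surface reddit-style answers
            "content": f"What are common ways mentioned on reddit to {topic}?",
        }
    ],
)

print(response.choices[0].message.content)
```

(the free web chat at chat.openai.com gives you the same kind of answer without writing any code.)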

[–] Mallard@beehaw.org 4 points 1 year ago (1 children)

Interesting. I've only dabbled with ChatGPT to check out its creative writing skills, and then later saw it fail at any mathematical questions.

Do you feel comfortable/confident in trusting the responses you get from it?

[–] Lazycog@lemmy.one 5 points 1 year ago* (last edited 1 year ago) (1 children)

I usually double-check, especially in critical cases. For online searches, the most useful part of Bing chat has been the references it adds to its answers, which I usually go check out (though the summary from Bing is still nice). It's quite clear when it doesn't know the answer, but even then the references often help point me towards useful resources!

Overall it isn't doing my job for me, but it speeds up the process significantly. It has definitely made me feel like my job is still secure, because the questions need to be very specific and the answers still require double-checking.

Edit: I've been so curious about ChatGPT. Do you notice a significant difference between Bing and ChatGPT itself? I did notice Bing does sometimes at least admit that it does not know. I heard ChatGPT just boldly gives an answer even when it doesn't know.

[–] Mallard@beehaw.org 6 points 1 year ago (1 children)

I have almost zero experience with bing chat, but I hear good things.

When I used ChatGPT for maths solutions it absolutely chose the "confidently incorrect" style, even stating things that made no sense or were easy mistakes: "as 7 is the largest prime", for example.

[–] Lazycog@lemmy.one 3 points 1 year ago

Uh oh! Hmm... well, Bing chat does have three settings that affect answer accuracy: creative, balanced, and precise. I don't know if ChatGPT has something similar? I usually leave it on balanced or precise and haven't noticed anything weird, but I haven't really tested its limits either. Give it a go!