[-] pyrex@awful.systems 15 points 2 months ago* (last edited 2 months ago)

I do not recommend using the word "AI" as if it refers to a single thing that encompasses all possible systems incorporating AI techniques. LLM guys don't distinguish between things that could actually be built and "throwing an LLM at the problem" -- you're treating their lack-of-differentiation as valid and feeding them hype.

[-] pyrex@awful.systems 15 points 2 months ago

I mean, if no one's getting paid, then my preferred price is $0, to everyone in the world.

[-] pyrex@awful.systems 32 points 2 months ago

You're right! There's a disclosure on the page but it's fuckin tiny.

[-] pyrex@awful.systems 11 points 2 months ago
  • high willingness to accept painfully inexact responses
  • high tendency to side with authority when given no information
  • low ability to distinguish "how it is" from "how it seems like it should be"

Meta:

  • default expectation that others are the same way
  • indignant consent-ignoring gesture if they're not

submitted 2 months ago* (last edited 2 months ago) by pyrex@awful.systems to c/techtakes@awful.systems

The machines, now inaccessible, are arguably more secure than before.

[-] pyrex@awful.systems 8 points 2 months ago

It's the technique of running a primary search against some other system, then feeding an LLM the top ~25 or so documents and asking it for the specific answer.
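
A minimal sketch of the idea, in case it helps: the index, the toy documents, and the `call_llm` placeholder below are illustrative stand-ins, not any particular vendor's API.

```python
# Retrieval-augmented generation, roughly: run a primary search elsewhere,
# hand the top documents to an LLM, ask it for the specific answer.

DOCS = [
    "The warranty covers manufacturing defects for two years.",
    "Returns are accepted within 30 days with a receipt.",
    "Shipping outside the EU takes 7-14 business days.",
]

def search(query: str, k: int = 25) -> list[str]:
    """Primary search: rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    ranked = sorted(DOCS, key=lambda d: -len(terms & set(d.lower().split())))
    return ranked[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for the actual model call; just echoes some metadata here."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    # Feed the top documents to the model and ask it for the specific answer.
    context = "\n".join(search(question))
    prompt = (
        "Answer the question using only the documents below.\n\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("How long do I have to return an item?"))
```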

[-] pyrex@awful.systems 19 points 2 months ago* (last edited 2 months ago)

The media again builds a virtual public consisting of billionaires holding a variety of positions and asks you, "which one do you agree with?" This is a strategy to push the public closer to the beliefs of billionaires.

I don't know who these fucking people are. The real public in California still supports Biden by a 25% margin.

[-] pyrex@awful.systems 10 points 2 months ago* (last edited 2 months ago)

I don't understand why people take him at face value when he claims he's always been a Democrat up until now. He's historically made large contributions to candidates from both parties, but generally more Republicans than Democrats, and also Republican PACs like Protect American Jobs. Here is his personal record.

Since 2023, he has donated ~$20,000,000 to Fairshake, a crypto PAC which predominantly funds candidates running against Democrats.

Has he moved right? Sure. Was he ever left? No, this is the donation record of someone who wants to buy power from candidates belonging to both parties. If it implies anything, it implies he currently finds Republicans to be corruptible.

[-] pyrex@awful.systems 10 points 4 months ago

This is another one for the "throw an AI model at the problem with no concrete plans for how to evaluate its performance" category.

[-] pyrex@awful.systems 12 points 4 months ago

It sounds like ChatGPT is eligible for a degree in business!

[-] pyrex@awful.systems 13 points 5 months ago

Show HN: I'm 16 and building an AI based startup called Factful with friends

In which the Orange Site is a very bad influence on some minors:

How do you evaluate “factuality” without knowing all the facts, though? That’s the downfall of all such services - eventually (or even immediately) they begin to just push their preferred agenda because it’s easier and more profitable.

Hi there, thank you for your feedback! I think we could potentially go down the route of a web3 approach where we get the public consensus on the facts.

...

Your first meta-problem to solve is to get people to care about the facts, and to accept them when they’re wrong. There is an astonishing gap between knowing the truth and acting accordingly.

Yea, that's why we also added in an grammar checker, even if they dont care about facts, they can get something better than gram marly that checks for way more for way less.

[-] pyrex@awful.systems 8 points 5 months ago* (last edited 5 months ago)

My opinion is that Jesse Lyu is lying about making any significant changes. (Because otherwise the demo wouldn't have worked)

I don't want bad things for him personally, but I want bad things to happen to people who lie in public.

The code is open source with licensing requirements, so I'm hoping someone Jesse has already made a statement to can write to him with these requests:

  • For GPL2 licensed components such as Linux: Give me your changes in source form.
  • For Apache-licensed components such as Android: What files did you change?

I can imagine him responding in three ways:

  • "Sure, here is another lie" -- and then he's locked into an answer which will probably make him look clueless as hell
  • "We don't think we have to do that" -- and now the Open Source Reply Guy Brigade instantly hates him.
  • No response at all -- and now, given that a conversation has actually occurred, he looks evasive.

[-] pyrex@awful.systems 12 points 5 months ago

Every rationalist I've met has been nice and smart and deserved better. These are nerds and not in a bad way, but in a way that gets them bullied and shamed and gaslit. And in practice I can come to agreement with them on lots of issues.

On this issue I can never pin them down -- responding with what I think are reasonable questions gets me shut down with what I believe is thought-stopping behavior. They rarely state their actual reasons, and the actual reasons always get slipperier when they have to verbalize them to people who don't agree.

No doubt if you're a cynical manipulator, "having your followers lie about what you believe" works for you. But a lot of these are going to be nice normal guys who are tired of being laughed at and, worryingly, tired of being made to think.

In this respect they have a lot in common with, say, high school kids who became communists in part to piss off their parents. I'm not saying that to mock those kids, because I was one of them -- and I think there's a huge part of this they're not wrong about. They're entitled to demand to be taken seriously, and precious few people take nerds seriously. And for that matter, there are philosophically sophisticated people doing the same work as them.

I don't know how we get them into spaces where something is actually done -- if not for humanity or whatever, for people very close to them who actually need it -- and where the seduction of ego and money isn't like, so readily and constantly available.

