OTOH infinite loop detection is a well-known coding issue with well-known, freely available solutions, so this approach will only affect the lamest implementations of AI.
An infinite loop detector detects when you're going round in circles. It can't detect when you're going down an infinitely deep acyclic graph, because that, by definition, doesn't have any loops to detect. The best it can do is have a threshold after which it gives up.
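A minimal sketch of that give-up threshold, assuming a hypothetical `fetch_links(url)` helper that returns the links found on a page:

```python
MAX_DEPTH = 5  # give-up threshold; illustrative value only

def crawl(url, fetch_links, depth=0):
    """Crawl that abandons any branch deeper than MAX_DEPTH.

    fetch_links(url) is a hypothetical helper returning the links on a page.
    On an infinitely deep acyclic trap no cycle ever appears, so the depth
    cutoff is the only thing that stops the descent.
    """
    if depth > MAX_DEPTH:
        return
    for link in fetch_links(url):
        crawl(link, fetch_links, depth + 1)
```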
You can detect pathpoints that come up repeatedly and avoid pursuing them further, which technically isn't called "infinite loop" detection, but I don't know the correct name. The point is that the software isn't a Star Trek robot that starts smoking and bricks itself when it hears something illogical.
It can detect cycles. From a quick look at the demo of this tool, it (slowly) generates some garbage text, after which it places 10 random links. Each of these links leads to a newly generated page. Thus, although generating the same link twice will surely happen eventually, the chance that all 10 of the links have already been generated before is small.
I would simply add links to a list when visited and never revisit any. And that's just simple web crawler logic, not even AI. Web crawlers that avoid problems like that are beginner/intermediate computer science homework.
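A rough sketch of that "never revisit" crawler logic, again assuming a hypothetical `fetch_links(url)` helper:

```python
from collections import deque

def crawl_once(start_url, fetch_links):
    """Visit each distinct URL at most once.

    fetch_links(url) is a hypothetical helper returning the links on a page.
    A visited set defeats cycles, but not a trap that mints a fresh URL
    for every link it serves.
    """
    visited = set()
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        queue.extend(fetch_links(url))
    return visited
```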
There are no loops or repeated links to avoid. Every link leads to a brand new, freshly generated page with another set of brand new, never-before-seen links. You can go deeper and deeper forever without any loops.
You can limit the number of visits to a domain. The honeypot doesn't register infinitely many new domains.
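A sketch of such a per-domain cap, with an illustrative budget and the same hypothetical `fetch_links(url)` helper:

```python
from collections import Counter, deque
from urllib.parse import urlparse

MAX_PAGES_PER_DOMAIN = 1000  # illustrative budget, not a recommended value

def crawl_with_domain_budget(start_url, fetch_links):
    """Stop fetching from a domain once its budget is spent.

    Because the honeypot serves all its generated pages from one domain,
    a per-domain cap bounds the damage even though every URL is unique.
    fetch_links(url) is a hypothetical helper returning the links on a page.
    """
    per_domain = Counter()
    visited = set()
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        domain = urlparse(url).netloc
        if url in visited or per_domain[domain] >= MAX_PAGES_PER_DOMAIN:
            continue
        visited.add(url)
        per_domain[domain] += 1
        queue.extend(fetch_links(url))
    return visited
```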
Sure, if you have enough memory to store a list of all the GUIDs.
It doesn't have to memorize all possible GUIDs; it just has to limit visits to base URLs.
What part of "they do not repeat" do you still not get? You can put them in a list, but you won't ever get a hit, so it'd just be wasting memory.