
I put up a VPS with nginx, and the logs show dodgy requests within minutes. How do you guys deal with these?

Edit: Thanks for the tips, everyone!

Dr_Toofing@programming.dev 5 points 1 year ago

These requests are probably made by search/indexing bots. My personal server gets quite a lot of them, but they rarely use much bandwidth.
The easiest option (probably disliked by more savvy users) is to just put Cloudflare in front of your server. It won't block the requests outright, but it will stop anything malicious.
With how advanced modern scraping techniques are, there is only so much you can do. I am not an expert, so take what I say with a grain of salt.

waspentalive@lemmy.one 2 points 1 year ago

Legitimate web spiders (for example, the crawler Google uses to map the web for search) should pay attention to robots.txt. I think, though, that that only applies to web-based services.
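
For instance, a minimal robots.txt served from the web root might look like this (the paths and the bot name are just illustrative):

```
# robots.txt, served from the web root (e.g. https://example.com/robots.txt).
# Well-behaved crawlers honour it; malicious scanners simply ignore it.
User-agent: *
Disallow: /admin/
Disallow: /private/

# Block one crawler entirely (the bot name here is made up for illustration)
User-agent: SomeScraperBot
Disallow: /
```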

rusty@lemmy.world 2 points 1 year ago

Fail2Ban is great and all, but Cloudflare provides such an amazing layer of protection with so little effort that it's probably the best choice for most people.

You press a few buttons and get a CDN, bot attack protection, DDoS protection, CAPTCHAs for suspicious connections, email forwarding, static website hosting... It's suspicious just how much stuff you get for free, tbh.
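
For comparison, the Fail2Ban route means maintaining something like the jail below. This is a rough sketch, assuming a recent Fail2Ban that ships the stock nginx-botsearch filter; verify the filter and the right log path for your distro in /etc/fail2ban/filter.d/:

```ini
# /etc/fail2ban/jail.local (a sketch, not a drop-in config).
# The stock nginx-botsearch filter bans clients that rack up 404s
# while probing for common exploit paths; check it exists on your system.
[nginx-botsearch]
enabled  = true
port     = http,https
logpath  = /var/log/nginx/access.log
maxretry = 2
findtime = 10m
bantime  = 1h
```

After editing, reload with `fail2ban-client reload` and inspect bans with `fail2ban-client status nginx-botsearch`.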

AES@lemmy.ronsmans.eu 7 points 1 year ago

And you only need to give them your unencrypted data...

GlitzyArmrest@lemmy.world 1 point 1 year ago (last edited 1 year ago)

To be fair, you can configure Cloudflare to use your own certs.

sifrmoja@mastodon.social 1 point 1 year ago

@GlitzyArmrest Including for origins? If not, the point of Cloudflare is gone.

ItsGhost@sh.itjust.works 3 points 1 year ago

You can use a custom origin certificate, but that's irrelevant when Cloudflare still decrypts and re-encrypts everything to analyse each request in more detail. It leaves me torn about using it: I don't use it on anything where sensitive plain text is flying around, especially authentication data (which is annoying, since that's the most valuable place to have the protection). But I do have it on my Matrix homeserver, as anything remotely important there is E2EE anyway, so there's little they can gain, and with the number of requests it gets, some level of mitigation is desirable.
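
If you do keep Cloudflare in front of something, it's worth at least rejecting traffic that bypasses it, so attackers can't hit the origin directly. A rough nginx sketch using Cloudflare's Authenticated Origin Pulls; the certificate paths are placeholders, and the CA file is the origin-pull CA that Cloudflare publishes:

```nginx
# Sketch of an nginx origin behind Cloudflare using Authenticated
# Origin Pulls, so direct requests that bypass Cloudflare are rejected.
# All file paths are placeholders; adjust for your own setup.
server {
    listen 443 ssl;
    server_name example.com;

    # The origin certificate (a Cloudflare-issued origin cert works here)
    ssl_certificate     /etc/nginx/certs/origin.pem;
    ssl_certificate_key /etc/nginx/certs/origin.key;

    # Require Cloudflare's client certificate on every TLS handshake;
    # the CA file is the origin-pull CA published by Cloudflare.
    ssl_client_certificate /etc/nginx/certs/cloudflare-origin-pull-ca.pem;
    ssl_verify_client on;

    location / {
        proxy_pass http://127.0.0.1:8080;  # your actual application
    }
}
```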