[-] rrobin@lemmy.world 3 points 5 days ago* (last edited 5 days ago)

Just piling on with some concrete examples: awesome-gemini is definitely the best place to start looking. It lists both converters for the gemtext format and gateways between the protocols.

For format conversion tools, awesome-gemini already lists a handful of tools.

On the gemini side there are gateways for specific websites, operated by various people:

  • BBC news gemini://freeshell.de/news/bbc.gmi
  • The Guardian gemini://guardian.shit.cx/world/
  • Lots of others gemini://gemi.dev/cgi-bin/waffle.cgi

These work pretty well for me. I think there were public gateways for opening arbitrary http pages from gemini, but I can't recall one off the top of my head.
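For the curious, the protocol behind these gateways is tiny, which is why they are easy to build: TLS on port 1965, a one-line request, a `<status> <meta>` header, then the body. A rough sketch in Python (hostnames are just examples; real clients do TOFU certificate pinning rather than simply disabling verification as I do here):

```python
# Minimal Gemini client sketch. A request is just the absolute URL plus CRLF,
# sent over TLS to port 1965; the response starts with "<status> <meta>\r\n".
import socket
import ssl
from urllib.parse import urlparse

def gemini_request_line(url: str) -> bytes:
    """A Gemini request is the absolute URL followed by CRLF."""
    return url.encode("utf-8") + b"\r\n"

def fetch(url: str, timeout: float = 10.0) -> tuple[str, bytes]:
    host = urlparse(url).hostname
    ctx = ssl.create_default_context()
    # Many capsules use self-signed certs (TOFU model); a real client would
    # pin the cert instead of skipping verification entirely.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, 1965), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(gemini_request_line(url))
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    return header.decode("utf-8"), body
```

That's the whole client side; gemtext itself is line-oriented and similarly trivial to parse.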

Some gemini browsers support gemini proxies to access http(s) content. You can run one on your own machine. Duckling is the only one I'm familiar with (but see the awesome list for more).

Conversely, to access gemini pages from a web browser, portal.mozz.us hosts a gateway (just paste whatever gemini link you want into the box).

One big privacy caveat of using gemini proxies for this: while it may improve your privacy with regards to javascript/cookies, it can also reduce it, because it makes your behaviour more identifiable from the point of view of the websites you visit (i.e. your proxy is clearly not a browser, which makes you unusual).

[-] rrobin@lemmy.world 9 points 3 months ago

Depends on what you mean by "secure". Being very loose with the definitions, we have:

  • end to end confidentiality (i.e. only you and the intended destination can see the message contents)
  • privacy (only the destination knows I'm sending messages to them)
  • anonymity (no one can find out who you are, where you live, i.e. metadata/identity/etc)

My personal preference is Simplex.

Reasoning for a few:

  • Email: even if you use PGP to encrypt message contents, the server(s) in the delivery path have access to all metadata (sender, receiver, etc). If no encryption is in use, they see everything. Encryption protocols in email only protect the communication between client and server (or hop by hop between servers)
  • XMPP: similar reasoning to email, i.e. the server knows what you send to whom. I should note that XMPP has more options for confidentiality of message content (PGP, OMEMO, others), so I find it preferable to email - but architecturally it is not too different.
  • IRC: again similar reasoning to email - even if your IRC server supports TLS, there is no end to end encryption to protect message contents. There have been some solutions for message encryption/signing, but I've never seen them in the wild.
  • Signal: good protocol (privacy, confidentiality, etc). The dependency on a phone number is a privacy concern for me. I think there are 3rd party servers/apps that work without phone numbers.
  • Simplex: probably the strongest privacy protection you can find, but definitely not easy in terms of usability. The assumption is that you do not trust the intermediate server at all (and expose nothing to it); you just leave your encrypted messages there for the receiver to pick up later. It also does some funny stuff like padding messages with garbage.
  • Matrix: in theory it supports end to end encryption in various scenarios, but my experience with it has been so bad (UX, broken encrypted sessions) that I only use it for public groups.
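To illustrate the padding trick: here's a toy sketch (NOT SimpleX's actual wire format; the block size is an arbitrary choice) of how fixed-size envelopes hide message length from an observing server:

```python
# Toy length-hiding padding: every message is padded to a fixed envelope size,
# so the server learns nothing about the real payload length.
import os

BLOCK = 16384  # fixed envelope size, arbitrary for this sketch

def pad(msg: bytes) -> bytes:
    if len(msg) > BLOCK - 2:
        raise ValueError("message too large for one envelope")
    # 2-byte length prefix, then payload, then random filler bytes
    filler = os.urandom(BLOCK - 2 - len(msg))
    return len(msg).to_bytes(2, "big") + msg + filler

def unpad(envelope: bytes) -> bytes:
    n = int.from_bytes(envelope[:2], "big")
    return envelope[2 : 2 + n]
```

Whether the message is one byte or ten kilobytes, the server sees the same 16 KiB blob either way.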

Some more food for thought, though: these protocols support both group communication and 1-1 messaging, and privacy expectations for the two are very different. For example, I don't care too much about confidentiality in a group chat with 3000 people in it; there I might be more concerned with concealing my phone/name/metadata.

In general I consider large group chats "public": I can try to be anonymous, but I have no other expectations. E.g. some people use these protocols over Tor because they do not trust the service (or even the destination), but they still try to protect their anonymity.

On a technical note: I don't think there is any protocol supporting multi-device that hasn't had some kind of vulnerability in the past. So I would temper my expectations when using these protocols across devices.

I'm not familiar with the other ones that were mentioned in comments or in the spreadsheet.

[-] rrobin@lemmy.world 4 points 4 months ago

There are gemini to http gateways so the content is probably already crawled anyway.

[-] rrobin@lemmy.world 11 points 4 months ago

So let's be clear - there is no way to prevent others from crawling your website if they really want to (AI or not).

Sure, you can put up a robots.txt or reject certain user agents (if you self host) to try to screen out the most common crawlers. But as far as your hosting is concerned, an AI crawler is not too different from, say, the google crawler that takes pieces of content to show in search results. You can put up a captcha or equivalent to screen out non-humans, but that does not work very well and might also prevent search engines from finding your site (which you may or may not want).
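For what it's worth, here's what the robots.txt mechanism amounts to, using Python's stdlib parser. The agent names and paths are just examples, and the whole point is that nothing forces a crawler to honor it:

```python
# How a *cooperating* crawler interprets robots.txt.
# Uncooperative crawlers simply never do this check.
import urllib.robotparser

rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("GPTBot", "https://example.org/stories/"))     # False (blocked)
print(rp.can_fetch("Googlebot", "https://example.org/stories/"))  # True (allowed)
```

The file is purely advisory; it is the crawler's own code that decides whether to call something like `can_fetch` before downloading.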

I don't have a solution for the AI problem. As for the "greed" problem, I think most of us poor folks do one of the following:

  • github pages (if you don't like github then codeberg or one of the other software forges that host pages)
  • self host your own http server if it's not too much of a hassle
  • (make backups, yes always backups)

Now for the AI problem, there are no good solutions, but there are funny ones:

  • write stories that seem plausible but hide hijinks in there - if there ever was a good reason for being creative it is "I hope AI crawls my story and the night time news reports that the army is now using trained squirrels as paratroopers"
  • doublespeak - if it works for fictional fascist states it works for AI too - replace all uses of a word/expression with another; your readers might be slightly confused, but such is life
  • turn off your website at certain times of the day - just show a message saying it only works outside of US work hours or something
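The last one is easy enough to sketch (the timezone and hours are arbitrary choices for the joke, and I'm ignoring DST):

```python
# Toy "closed during US business hours" gate for a website.
from datetime import datetime, timezone, timedelta

US_EASTERN = timezone(timedelta(hours=-5))  # fixed offset, ignoring DST

def site_is_open(now: datetime) -> bool:
    local = now.astimezone(US_EASTERN)
    # Closed 9:00-17:00 Eastern on weekdays, when crawler traffic peaks.
    if local.weekday() < 5 and 9 <= local.hour < 17:
        return False
    return True
```

Your http server (or a tiny middleware) would check this and serve the "come back later" page instead.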

I should point out that none of this will make you famous or raise your SEO rank in search results.

PS: can you share your site? Now I'm curious about the stories.

[-] rrobin@lemmy.world 11 points 8 months ago

I don't quite agree with some of the rationale:

  1. I do think users have benefited from Open Source, but I also think there has been a decline in Open Source software in general
  2. I don't think contracts are a good analogy here (in the sense that every corporate consumer of the software would have to sign one)

Having said this I do understand where he is coming from. And I agree that:

  1. a lot of big companies consume this software and don't give back
  2. corporate interests are well entrenched in some Open Source projects, and some bad decisions have been made
  3. he does raise an interesting point about the Commons Clause (but then I'm no lawyer)

I would like to remind everyone that the GPL pretty much exists because of (1). If anything we should have more GPL code; in that regard I don't think it failed us. But we rarely see it enforced (in court). Frankly, most of our code is not that special, so please GPL it.

Finally, I think users do learn about Open Source software indirectly - in the same way they find out their "public" infrastructure has been running without permit or inspection the day things start breaking, and the original builder/supplier is long gone and left no trace of how it works.

Since these days everything is software (or black box hardware with firmware), this is increasingly important in public policy. And I do wish we would see public contracts asking for hardware/firmware what some already ask for software.

I won't get into the RedHat/IBM+CentOS/Fedora or AI points because there is a lot more going on there. Not that he's wrong. But I'm kind of fed up with it :D

[-] rrobin@lemmy.world 7 points 10 months ago

I've tried a few times in the past 2 weeks, using a good email account and also with github - no luck though. Maybe it's doing some "smart" heuristics to trigger it.

I just retried now, using that temp mail (but no vpn) and got the exact same phone verification. Maybe my IP address is evil :D

429
submitted 10 months ago by rrobin@lemmy.world to c/privacy@lemmy.ml

Looks like gitlab now requires account verification for new accounts in addition to email: either a phone number or a credit card.

This applies both to accounts created with a working email and to logins via your github account. You can't even verify your email until you go through step 1.

I don't know when this started, but it's been at least the last month or two, judging from these posts in the forums.

Fun fact: I don't even want to host on gitlab, I just wanted to report bugs in some projects. So I'm locked out.

[-] rrobin@lemmy.world 9 points 11 months ago

I'm a bit of a terminal nerd, so probably not the best person to talk about the desktop. I don't have many thoughts with regard to app development or layout for accessibility. What I really would like is for distros to be accessible from the ground up, even before the desktop is up.

The best example of accessibility from the ground up I've seen on linux was TalkingArch, an Arch Linux spin with speech. Sadly the website is gone, but you can find it in the web archive.

In particular there was an audio tutorial to help you install the live cd (you can still hear it in the archive):

Here are a few resources, which are pretty dated but I wish they were the norm in any install:

Now going into your points:

How should a blind Desktop be structured?

To be honest I don't expect much here. As long as context/window switching is signalled to you properly, you are probably fine. I have not used gnome with orca in a long time, but this used to be ok. The problems begin with the apps, tabs and app-internal structure.

Are there any big dealbreakers like Wayland, TTS engines, specific applications etc.?

Lots.

Sometimes your screen reader breaks, and it's nice to have a magic key that restarts the screen reader, or the entire desktop. Or you just swap into a virtual console running speakup/yasr and do it yourself :D

TTS engines are probably ok. Sometimes people complain about the voices, but I think it is fine as long as it works reliably, does not hang, and responds quickly.

Specific applications are tricky. A lot of apps won't work well with their default settings, but that is not surprising.

I do think that a lot of newer apps have two problems:

  1. they are not configurable or scriptable at all - there is only one way to do things and no way to customize it, and opening tickets to patch each and every feature is not feasible
  2. they frequently go through breaking release cycles that nuke old features, so you need to relearn all your tricks on the next major release and find new hacks

I can give you two good-ish examples: both Vim and Mutt can work very well with a terminal screen reader, but they take a lot of work to configure:

  • with vim you need to disable all the features that make the cursor jump around and draw stuff (like line numbers and the ruler)
  • with mutt every single string on the screen can be customized, so you can even insert SSML to control speech while reading email
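For the vim case, the sort of thing I mean looks like this (a sketch with stock vim options; your particular screen reader setup may want more or fewer of these):

```vim
" Disable everything that makes the cursor jump or draws decorative UI
" that a screen reader would otherwise read aloud.
set nonumber norelativenumber   " no line numbers in the gutter
set noruler                     " no cursor-position ruler
set noshowcmd noshowmode        " no partial-command / mode echoes
set laststatus=0                " hide the statusline entirely
set signcolumn=no               " no sign gutter
set nocursorline nocursorcolumn " no cursor highlighting
```

None of this is exotic; the point is that it takes deliberate configuration that newer, less scriptable apps simply don't offer.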

I think you can find similar examples in desktop apps too.

What do you think would be the best base Desktop to build such a setup on?

No idea, to be honest. Gnome used to have support. I suppose other desktops that can be remote controlled could be changed to integrate speech (like i3 or sway).

Would you think an immutable, out of the box distro like "Fedora Silverblue", with everything included (the best tools, presets, easy setup etc.) is a good idea?

I have never used Silverblue. But the key thing for me is being able to roll back (or forward) to a working state.

How privacy-friendly can a usable blind Desktop be?

I think it should be fine. People with screens have things like laptop screen privacy filters; people using audio have headphones. Depending on your machine you can set up the mixer so that audio never uses the external speaker.

I don't recall the details but you can also have some applications send audio to the external speaker while others use your headphones (provided they are a separate sound card, like usb/bluetooth headphones).

Also, how would you like to call it? “A Talking Desktop”?

Urgh, Shouting Linux.

[-] rrobin@lemmy.world 33 points 11 months ago

This is a really nice summary of the practical issues surrounding this.

There is one more that I would like to call out: how does this client-side scanning code end up running on your phone? I.e. who pushes it there and keeps it up to date (and, by consequence, the database)?

I can think of a few options:

  1. The messaging app owner includes it as part of their code, and every msg/image/etc is checked before sending (/receiving?)
  2. The phone OS vendor puts it there, baking it into the image store/retrieval API - in a sense it works on your gallery more than on your messaging app
  3. The phone vendor puts it there, just like they already do for their branded apps.
  4. Your mobile operator puts it there, just like they already do for their stuff

Each of these has its own problems/challenges: how to compel them to insert this (ahem, "backdoor"), and the different risks that come with each of them.

[-] rrobin@lemmy.world 12 points 1 year ago

Fair point (IP, email, browser session data). Those should not be exposed via federation in any way. And the existence of the federated network means we can switch instances if we are concerned our instance is a bad actor in this regard.

I did not mean to suggest the ecosystem is not valuable for privacy. I just really don't want people to associate federation with privacy protections for data that is basically public (posts, profile data, etc). Wrong expectations about privacy are harmful.

[-] rrobin@lemmy.world 34 points 1 year ago

To be fair I do not expect any privacy protections from lemmy/mastodon in general, or from blocking/defederation in particular.

Lemmy/Mastodon protocols are not really private: as soon as you place your data on one instance, it is accessible by others on the same instance. If that instance is federated, this extends to other instances too. In other words, the system can be seen as mostly public data, since most instances are public.

The purpose of blocking or defederation (which is blocking at instance level) is to fight spam content, not to provide privacy.

[-] rrobin@lemmy.world 6 points 1 year ago

They could serve similar purposes. In terms of maturity nostr is younger. Here are the main differences from the point of view of nostr:

  • In nostr there is no registration; your identity is a public key that you generate yourself (lose it and you cannot recover the identity). You can connect to a bunch of different nostr relays with the same key, or use different ones.
  • AFAIK nostr does NOT do end to end encryption for group chat, but it does support end to end encryption for direct messages
  • nostr does not do video/audio calls
  • nostr does not host your images/files, you just put some URL in your messages

At its core nostr is a basic protocol where you send messages to a relay server, and the relay passes them along to other people when they request them. On top of those messages people implement extensions for features: full length posts, payments, etc. There are notions of followers and subscriptions (like twitter), but those are just tiny messages where you ask the relay for messages from person A or B. The list of specifications is here: https://github.com/nostr-protocol/nips
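To give an idea of how bare-bones the core is, here is a sketch of computing an event id in the style of NIP-01: sha256 over the canonical JSON array `[0, pubkey, created_at, kind, tags, content]`. Keys and signatures are omitted, and the pubkey below is a placeholder, not a real key:

```python
# Sketch of a NIP-01-style event id: sha256 of the compact JSON serialization
# of [0, pubkey, created_at, kind, tags, content], hex-encoded.
import hashlib
import json

def event_id(pubkey: str, created_at: int, kind: int,
             tags: list[list[str]], content: str) -> str:
    payload = [0, pubkey, created_at, kind, tags, content]
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

eid = event_id("ab" * 32, 1700000000, 1, [], "hello nostr")
print(len(eid))  # 64 hex characters
```

Everything else - follows, reposts, long posts - is layered on top of events shaped like this.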

Finally, there are a few different nostr implementations for relays, clients and web interfaces. Some of them do not implement all the features, so you may need to shop around a bit if you are looking for some fancy feature (check https://github.com/vishalxl/Nostr-Clients-Features-List).

Also some nostr highlights which I think have no equivalent in matrix (but deserve nerd points):

  • message expiration dates - the relay removes them after the deadline
  • nostr has built-in proof of work to dissuade spam, by forcing the client to do some computation before posting
  • you can do reposts across relays, or share relay addresses with people on another relay
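The proof of work bit is simple to sketch too. This follows the NIP-13 idea (grind a nonce tag until the event id has N leading zero bits) over a simplified event; a target of 8 bits finishes near-instantly, real targets are higher:

```python
# Nostr-style proof of work sketch: mine a nonce so the event id (sha256 of a
# simplified NIP-01 event array) has at least `target_bits` leading zero bits.
import hashlib
import json

def leading_zero_bits(hex_id: str) -> int:
    value = int(hex_id, 16)
    return 256 - value.bit_length() if value else 256

def mine(pubkey: str, content: str, target_bits: int) -> tuple[str, int]:
    nonce = 0
    while True:
        tags = [["nonce", str(nonce), str(target_bits)]]
        payload = [0, pubkey, 1700000000, 1, tags, content]
        serialized = json.dumps(payload, separators=(",", ":"))
        eid = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
        if leading_zero_bits(eid) >= target_bits:
            return eid, nonce
        nonce += 1

eid, nonce = mine("ab" * 32, "hello", 8)  # ~256 attempts on average
```

The relay only has to hash the event once to verify the work, while spamming becomes proportionally expensive.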

[-] rrobin@lemmy.world 9 points 1 year ago

As any engineer who does ops can tell you, you did the right thing - the solution is always to roll back, never force a roll forward, ever.

We should totally do pre and post update parties though. Even if the update fails we can have an excuse for drinks and a fun thread.


rrobin

joined 1 year ago