levi
This is awesome but I don't really understand.
The purported issue is that they don't have explicit consent for some data points. They apparently responded by saying they were going to charge a subscription.
Why wouldn't they just get consent? I'm sure most fb users will just agree to anything put in front of them.
You're right, this tool isn't designed to address that problem and is ill-suited to it.
Lemmy should definitely render a static page and then "hydrate" it with JavaScript in the client. This is a common problem with modern JS apps. SSR (server-side rendering) is the solution, but it can be very complex. You really need to build the whole app with SSR in mind; it's not really something to bolt on as an additional feature at the end.
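To make "hydrate" concrete, here's a minimal sketch in plain JavaScript (the markup and ids are invented for illustration, this isn't Lemmy's actual code):

```js
// The server has already rendered real markup into the page, e.g.:
//   <button id="upvote" data-count="41">41</button>
// "Hydrating" means attaching behaviour and state to that existing DOM
// instead of building the DOM from scratch in the client.
const button = document.getElementById("upvote");
let count = Number(button.dataset.count); // recover state from server-rendered HTML

button.addEventListener("click", () => {
  count += 1;
  button.textContent = String(count);
  // a real app would also send the vote to the server here
});
```

The point is that the content is already in the HTML before any JavaScript runs, which is exactly what crawlers (and users on slow connections) need.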
Sadly no, it won't help in that way either (although that wasn't the intent).
The lifecycle of a webpage in a browser is something like:
- download the page - this includes the text content, plus links to other resources like formatting (CSS), logic (JavaScript), and images
- start downloading other resources
- render the text, and the other resources as they arrive
- start manipulating the page with JavaScript, which in this case includes a final step:
- download the lemmy post and render it, including the post, the comments, and a link to the original post (sketched just below)
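Roughly, that final step boils down to something like this (a simplified sketch of the idea, not the actual source):

```js
// Suppose the static page carries a declaration like:
//   <div id="lbs" data-instance="https://lemmy.world" data-post-id="123456"></div>
// That div is all a crawler sees; the comments only exist after this runs.
async function loadComments() {
  const el = document.getElementById("lbs");
  const { instance, postId } = el.dataset;

  // Fetch the comments for the post as JSON from the Lemmy instance (v3 API).
  const res = await fetch(`${instance}/api/v3/comment/list?post_id=${postId}&sort=Top`);
  const { comments } = await res.json();

  // Render each comment into the page (a real version would render
  // markdown, scores, nesting, etc.).
  for (const { comment, creator } of comments) {
    const p = document.createElement("p");
    p.textContent = `${creator.name}: ${comment.content}`;
    el.appendChild(p);
  }
}

loadComments();
```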
When Google and others crawl the web looking for data to include in search, they generally only index the content from step 1, so they would only see the parameters passed to LBS (shown as the "declaration" in the demo), and would not see anything rendered by LBS.
This is the "static" nature of static sites. The page downloaded by the browser is not built on request; rather, it's only built periodically by the author, usually whenever they have an update or another post. I haven't posted anything on my blog in months, so the pages wouldn't have been re-built during that time.

There are benefits to this strategy in that it's very secure, very robust, very fast, and very easy to host, but the disadvantage is that any dynamic or "up to date" content (like comments from lemmy) needs to be prepared by the client, and thus cannot be included in step 1 above or indexed in search.
There is a best-of-both-worlds approach (SSR) where you render all the comments when a page is originally built, and then update them when the client later renders the page. This means there's at least something for search indexers to see, even if it's not up to date. The problem here is that there's a plethora of different engines people use to build pages, and I can't make a plugin for all, or even a few, of them.
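Hypothetically, the build-time half could look something like this (a simplified sketch, not an existing feature; it reuses the same declaration markup and Lemmy API call as the client-side sketch above):

```js
// Run by the static site generator (Node 18+ for global fetch, ESM for
// top-level await). Bakes whatever comments exist at build time into the
// page so crawlers see something; the client-side loader later replaces
// them with fresh ones.
import { writeFile } from "node:fs/promises";

const instance = "https://lemmy.world"; // example instance
const postId = 123456;                  // example post id

const res = await fetch(`${instance}/api/v3/comment/list?post_id=${postId}&sort=Top`);
const { comments } = await res.json();

// NOTE: real code must HTML-escape comment.content to avoid injection.
const rendered = comments
  .map(({ comment, creator }) => `<p>${creator.name}: ${comment.content}</p>`)
  .join("\n");

// Write a fragment the generator can include in the page; the output
// path is just an example.
await writeFile(
  "public/comments-fragment.html",
  `<div id="lbs" data-instance="${instance}" data-post-id="${postId}">\n${rendered}\n</div>\n`
);
```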
With all that in mind, this is fantastic feedback, and it's why I posted this pre-alpha demo. Lots of commenters have said the same thing. I can refactor to at least make SSR easier.
That's not really possible because there's no way to know which instance to direct someone to, and there's no point directing them to an instance where they don't have an account.
Also, I don't think showing buttons like upvote that just redirect to another page is good UX at all.
The short answer is no. It doesn't create a static page from lemmy comments. It loads dynamic lemmy comments into static pages.
> a lemmy post along with its comments can become static content on a static web page of your choice
This isn't quite right. The "static web page" (the pre-rendered page) doesn't include the Lemmy post or comments. When your browser renders the static page, the browser then pulls down the Lemmy comments.
It's going to continue to work; it's just going to be either paid or ad-supported.
Just because a particular service was previously provided at no additional charge does not mean it has to be so in perpetuity, especially given that the vast majority of subscribers aren't using the service any longer.
I am ideologically opposed to this form of advertising.
Commercial enterprises can do what they want but I don't think it's at all appropriate for a public institution.
This stinks.