It was some on-board GPU with my super amazing AMD K6-2; it couldn't even run Mega Man X without chugging. Then a friend gave me an S3 ViRGE with a glorious 4MB of VRAM.
I'm sure there is a simple answer and I'm an idiot, but given it's in a place that gets lots of sun, can they not just install solar panels with batteries at the consumer/grid level?
Or is the problem not with generating the power but with transmitting it to properties? I don't know the cost of solar installation, but given how much it costs them when it all fails, I'm sure they could at least incentivize individuals to install solar or something.
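For a rough sense of scale, here's a back-of-envelope sizing sketch in Python. Every number in it (household usage, panel wattage, sun hours, backup days) is an illustrative assumption on my part, not a figure from any real grid or installer:

```python
# Back-of-envelope sizing for household solar + battery.
# All inputs are illustrative assumptions, not real grid data.

daily_usage_kwh = 30.0   # assumed average household consumption per day
peak_sun_hours = 5.0     # assumed usable full-power sun hours per day
panel_watts = 400        # assumed rating of a single modern panel
backup_days = 2          # how many days of outage the battery should cover

# Panels needed to generate one day's usage during the sun hours.
array_kw = daily_usage_kwh / peak_sun_hours       # ~6 kW array
panels = array_kw * 1000 / panel_watts            # ~15 panels

# Battery capacity to ride out a multi-day grid failure.
battery_kwh = daily_usage_kwh * backup_days       # ~60 kWh

print(f"Array: {array_kw:.1f} kW (~{panels:.0f} panels)")
print(f"Battery: {battery_kwh:.0f} kWh for {backup_days} days of backup")
```

Even with generous assumptions, that's a decent chunk of hardware per property, which may be part of why it isn't the quick fix it looks like.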
Really enjoying it so far.
I was initially saddened to hear it was going to follow in the steps of 15 and be an action-based RPG, as I thought 15 was a brain-dead "warp strike simulator" with horrible story pacing and poor characters (until the last 5% of the game).
This game, though, has simple but effective action combat with enough variety to be fun, and the characters and pacing are a joy.
I still wish we could get some FF games like 7 or 9 where there is depth to equipment, magic, and turn-based combat, but JRPGs have been iterating away from complex battle systems and still sell well, so I can't see them going back.
I still think FF7 was the pinnacle, as materia mixing and matching with equipment was really simple and super fun.
Anyway, rant over: FF16 is good, recommend it.
One point that stands out to me is that when you ask it for code, it will give you an isolated block of code to do what you want.
In most real-world use cases, though, you are plugging code into larger codebases with design patterns and paradigms that need to be followed throughout.
An experienced dev can take an isolated code block that does X and refactor it into something that fits the current codebase; we already do this daily with Stack Overflow.
An inexperienced dev will just take the code block and try to ram it into the existing code in the easiest way possible, without thinking about whether the code could use existing dependencies, whether it's testable, etc.
So anyway, I don't see a problem with the tool itself; it's just like using Stack Overflow. But as we have seen, businesses and inexperienced devs seem to think it's more than this and that it can do their job for them.
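To make that concrete, here's a toy Python sketch of the two approaches. The `client` object and its `get_json` method are hypothetical stand-ins for whatever HTTP abstraction an existing codebase might already provide; none of this is anything the tool actually outputs:

```python
import json
import urllib.request

# What a generator typically hands you: self-contained, ignores the codebase.
def get_user(user_id):
    url = f"https://api.example.com/users/{user_id}"  # placeholder URL
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# What an experienced dev refactors it into: reuse the (hypothetical) HTTP
# client the project already depends on (auth/retries/logging baked in),
# and inject it so tests can pass a fake instead of hitting the network.
def get_user_refactored(user_id, client):
    return client.get_json(f"/users/{user_id}")
```

Same behaviour, but the second version respects existing dependencies and is actually testable, which is exactly the refactor step the inexperienced dev skips.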
Are you talking specifically about LLMs or neural-network-style AI in general? Supercomputers have been doing this sort of stuff for decades without much problem, and tbh the main issue is training; for LLMs, inference is pretty computationally cheap.
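The usual back-of-envelope behind that claim: a transformer forward pass costs roughly 2 FLOPs per parameter per token, while training costs roughly 6 per parameter per token and has to be repeated over the whole training set. A quick sketch, where the model size and token counts are assumptions for illustration:

```python
# Rough rule of thumb: inference ~2*N FLOPs/token, training ~6*N FLOPs/token
# repeated over every token in the training set.
# Model and token counts below are assumed, purely for illustration.

params = 70e9          # assumed 70B-parameter model
train_tokens = 2e12    # assumed 2T-token training run
reply_tokens = 500     # one chat-sized response

train_flops = 6 * params * train_tokens   # ~8.4e23 FLOPs, one-off
infer_flops = 2 * params * reply_tokens   # ~7e13 FLOPs per reply

print(f"training: {train_flops:.1e} FLOPs (one-off)")
print(f"one reply: {infer_flops:.1e} FLOPs")
print(f"ratio: {train_flops / infer_flops:.1e}")
```

Under these assumptions a single reply is around ten orders of magnitude cheaper than the one-off training run, which is why the cost conversation centres on training.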
I disagree; there are loads of white papers detailing applications of AI in various industries. Here's an example; cba googling more links for you.
I don't mean it's like the dotcom bubble in terms of context, I mean in terms of feel. Dotcom had loads of investors scrambling to "get in on it", many not really understanding why or what it was worth, but just wanting quick wins.
This has the same feel. It's a bit like crypto, as you say, but I would say crypto is very niche in real-world applications at the moment, whereas AI does have real-world uses.
They are not the ones we are being fed in the mainstream, like it replacing coders or artists; it can help in those areas, but that's just them trying to keep the hype going. Realistically it can be used very well for some medical research and diagnosis scenarios, as it can correlate patterns very easily, showing the likelihood of genetic issues.
The game and media industries are very much trialling voice and image synthesis for improving environmental design (texture synthesis) and providing dynamic voice synthesis based off actors' likenesses. We have had people's likenesses in movies for decades via CGI, but it's only really now that we can do the same for voices. And this isn't even getting into logistics and/or finance, where it is also seeing a lot of application.
It's not going to do much for the end consumer outside of the guff you currently use Siri or Alexa for, but inside those industries AI is very useful.
A lot of the AI boom is like the dotcom boom of the web era. That bubble burst and a lot of companies lost money, but the technology is still very much important and relevant to us all.
AI feels a lot like that: it's here to stay, maybe not in the ways investors are touting, but for voice, image, and video synthesis/processing it's an amazing tool. It also has lots of applications in biotech, targeting systems, logistics, etc.
So I can see the bubble bursting and a lot of money being lost, but that is the point when actually useful applications of the technology will start becoming mainstream.
I love SteamOS for gaming, and I think going forward it may get more and more adoption, but a lot of the day-to-day apps and dev tools I use don't have Linux releases (and can't be run via Wine/Proton). I would love to jump over to it as my host OS rather than dabbling with it via VMs/Steam Deck, but it's just not productive enough.
One especially painful thing is when certain libs I'm developing with need different versions of glibc or GTK from the ones installed by default on the OS, and then I die inside.
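For what it's worth, here's a stdlib-only Python snippet I'd reach for to check which glibc the current runtime is linked against, before a prebuilt native lib blows up with a "version GLIBC_2.xx not found" style error:

```python
# Print the C library the current interpreter is linked against.
# Handy when a prebuilt native wheel/lib demands a newer glibc than the OS has.
import platform
import sys

lib, version = platform.libc_ver()  # e.g. ("glibc", "2.31") on most distros
print(f"interpreter: {sys.executable}")
print(f"libc: {lib or 'unknown'} {version or ''}".strip())
```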
I just wish we could have fewer ways to do things in Linux.
I get that's one of the main benefits of the ecosystem, but it adds too much of a burden on developers and users. A developer can release something for Windows easily, same for Mac, but for Linux is it a Flatpak, a deb, a Snap, etc.?
Also, given how many shells and how much pluggable infrastructure there is, it's not like troubleshooting on Windows or Mac, where you can Google something and others will have the exact same problem. On Linux some may have the same problem, but most of the time it's a slight variation, and there are fewer users in the pool to begin with.
So a lot of stuff is stacked against you. I would love for it to become more mainstream, but to do so I feel it needs to be a bit more like Android, where there is a single way to build/install packages, and to get more people onto a common shell/infrastructure so there are more people with the same setup to help each other. Even if it's not technically the best possible setup, if it's consistent and easy to build for, it's going to speed up adoption.
I don't think it's realistically possible but it would greatly help adoption from consumers and developers imo.
Most companies can't even give decent requirements for humans to understand and implement. An AI will just write any old stuff it thinks they want, and they won't have any way to really know if it's right.
They would have more luck trying to create an AI that takes whimsical ideas and turns them into quantified requirements with acceptance criteria. Once they can do that, they may stand a chance of replacing developers, but it's gonna take far more than the simpleton code generators they have at the moment, which at best are like bad SO answers you copy, paste, and then refactor.
This isn't even factoring in automation testers (who are programmers), build engineers, DevOps, etc. Can't wait for companies to cry even more about cloud costs when some AI is just lobbing everything into Lambdas.
If these were stories I was picking up to implement, I would be asking the BA to elaborate some more.