My Thoughts on AI

The Gist

I really don’t feel like AI belongs in any production environment. There has been no case of an AI generating something I would be comfortable shipping. Text models lack the human element that makes writing feel real, repeating phrases and words in a very robotic manner. Image models require too much training on copyrighted material to be feasible, and most of the time it’d be much easier to just do the art or take the picture you want yourself. I find AI, in its current state, better as a toy: it shouldn’t be taken seriously, and it isn’t worth spending money on.

Do I use any for my content?

I’ve used local LLMs to generate the “Summary” text you find on page indexes, but never the page content itself. Most of the time the output isn’t even good enough for that, but it was nice to have when transferring many pages over to this format. Most pages have since had that text rewritten to better explain the page, and any new pages are done completely by hand, which is a skill I want to get better at, because I feel my summaries come across as fairly dry and boring.
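For the curious, that workflow looked roughly like the sketch below. This assumes a local model served through Ollama; the endpoint, model name, and prompt wording are illustrative placeholders, not my exact setup.

```python
# Rough sketch of one-off summary generation with a local LLM.
# Assumes an Ollama server on its default port; the model name and
# prompt are placeholders, not the exact setup described above.
import requests


def summarize_page(page_text: str) -> str:
    """Ask a locally hosted model for a short index summary of a page."""
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default generate endpoint
        json={
            "model": "llama3",  # any locally pulled model would do
            "prompt": (
                "Write a one-sentence summary of the following page "
                "for a site index:\n\n" + page_text
            ),
            "stream": False,  # return the whole response as one JSON object
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"].strip()
```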

The problems

Certainly not all the problems, but the ones that matter the most to me.

Stunted learning opportunities

For me, learning is the goal in any project I work on, and I feel like many AI “solutions” completely circumvent having to learn the skill they bludgeon to death in their outputs. I ran into this regularly when trying out image generation models: something would be just “a little off” in the output, or I’d want to enforce a style that wasn’t based on some existing artist’s. Having to regenerate, inpaint, or whatever else to adjust styles takes so much time that it would’ve been easier and more efficient to just learn how to do the art myself.

Training data

I alluded to this before, but because image generation models are trained on art without the artists’ consent, their output is dubious at best. I wouldn’t feel comfortable using any image generator’s output in a production setting.

Unjustified confidence

Text-based models tend to be extremely confident in their output. As humans, we’re susceptible to anyone, or anything, that sounds that sure of itself, and we just take what it says at face value. This can cause catastrophic problems over time as we absorb incorrect information as fact. We used to live in a world where we could simply Google something to verify whether our sources were correct, but now that Google integrates AI into its results, it’s just as shitty.