25 things ChatGPT isn't great at
Like most developers, I spend a lot of time in ChatGPT these days. It's excellent for helping me write code that I typically wouldn't write (nor want to). Things like complex regex or SQL expressions come to mind.
It's saved me a ton of time in the past few months and I absolutely would not want to go back to living without it. But has it meaningfully transformed anything yet? I would have to say no, not really. Useful? Yes. Transformative? Not at this point, not for me.
I wanted to publish a take on this that I could reflect back on in years to come. Will this all be completely wrong? We'll see.
ChatGPT is great with SQL and regex, but what is it not great with? Here's a quick list of 25 things. (Note that these are short summaries. In each instance I adjusted the prompt many, many times to get better responses.)
At home
- Cooking. Help with recipe finding, ingredient matching, etc. It doesn't do much better than I can do myself.
- Home maintenance. I've asked it a few questions about things around the house but the advice is always extremely generic.
- Business idea generating. I've asked it for business ideas around areas that I'm interested in, all super generic.
- Personal qualities/aptitude condensing. I put in: I'm good at this and this and this, what should I do with my life? Answers are pretty poor, generic.
- Mental health. I might put in: I'm feeling down today, what can I do about that? I already know I should go outside or do some exercise, but that's basically all ChatGPT tells me to do.
- Email writing. My wife is an elementary school teacher who is often buried in email. We've tried getting ChatGPT to write responses for her — they're never personal enough.
- I've asked for ideas for things to do with my kids, given their interests as a prompt. Again, nothing beyond what I've already thought of myself.
- I've thought about having ChatGPT write a story with my kids, but I'm much more interested in them playing outside than playing on a screen.
- Taxes. I've asked some basic questions but get totally unhelpful responses.
- Investing. I assume the entire system is programmed to be generic in this area. Lots of fun facts, but nothing actionable for sure.
At work
- Copyediting. It does this well, but not amazingly so. I ask it to rewrite sentences a lot, and it gives me ideas, but I still end up doing the work. It's ok for idea-generating I guess.
- Tutorials. I've tried guiding ChatGPT toward a tutorial-style chat. It doesn't do this well, as it never starts at the right place.
- Keeping up with the ever-changing world of frontend development. It can tell me about different tools, but not about how they fit together, or where industry thinking is now.
- Tailwind. It doesn't seem to know Tailwind very well. It will only give basic answers I already know, or will make up classes that don't exist.
- Next.js. This is more the fault of Next itself, because it's moved so fast, but answers to most Next questions are generally outdated.
- React Server Components? No clue.
- App router? Yeah, no.
- Next.js in general in 2023? Since ChatGPT's current training data goes up to 2021, it's just not very helpful with Next.js content at the moment.
- Is using useEffect a ton a bad idea? Not to ChatGPT.
- React Query setup and troubleshooting can be long-winded and tricky. ChatGPT is of very little help here.
- It doesn't answer meta questions well: should I use context for this? Should I write a hook for this? It can show you how to do those things, but not if or why you should.
- TypeScript concepts seem technically correct, but explained in a way that's too complex. YouTubers tend to do a much better job of mapping new concepts to existing ones.
- Recruiting. Recruiting and finding candidates is still as hard as ever. No help on this front.
- Interviewing. I already have a bunch of great questions to ask candidates. Teasing out the answers is always the hard part.
- Tests. ChatGPT is great at writing unit tests, but not great for teaching testing basics. When to test, why. What to look for. Methodologies, philosophies, etc. These are the critical parts, not the writing of the actual tests.
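For what it's worth, here's the sort of meta-answer I was fishing for on the useEffect question — a minimal sketch of my own (the names are made up for illustration), not something ChatGPT produced. The React docs make a similar point: derived data usually doesn't need an effect at all.

```javascript
// A common useEffect overuse: storing derived data in state and
// syncing it with an effect, e.g. (React, simplified):
//
//   const [fullName, setFullName] = useState('');
//   useEffect(() => { setFullName(first + ' ' + last); }, [first, last]);
//
// The meta-answer I wanted: if a value can be computed from existing
// props or state, just derive it during render instead of syncing it.
function fullName(first, last) {
  return `${first} ${last}`;
}

console.log(fullName('Ada', 'Lovelace')); // "Ada Lovelace"
```

That "should I even reach for an effect here?" judgment is exactly what ChatGPT won't volunteer unless you ask it point-blank.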
Would I be less productive if ChatGPT were taken away from me? Absolutely. Has it completely transformed the way I work? Certainly not.
I don't have access to GitHub Copilot yet, so I'm curious how that will change things. At this moment, halfway through 2023, I find AI to be a very useful tool. But generally not more than that.
I wonder how this take will read in 2025.