Why I Built a SQL to C# POCO Generator
Quick — what's the C# type for UNIQUEIDENTIFIER?
Guid. You knew that. What about MONEY? decimal. BIT? bool. DECIMAL(18,2)? decimal again. And if the column is nullable, every one of those value types becomes Guid?, decimal?, bool?. Miss one nullable marker and your app crashes at runtime when the database returns NULL.
You know these mappings. You've known them for years. And yet every time you sit down to create a POCO from a SQL table, you still find yourself checking. Column by column. Fifteen columns. Thirty tables. An afternoon gone.
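That mental lookup table can be written down once. Here's a minimal sketch in Python of the mapping plus the nullability rule (the names and the abbreviated type table are illustrative, not the tool's actual code):

```python
# Illustrative SQL Server -> C# type map (abbreviated, not exhaustive).
SQL_TO_CSHARP = {
    "UNIQUEIDENTIFIER": "Guid",
    "MONEY": "decimal",
    "DECIMAL": "decimal",
    "BIT": "bool",
    "INT": "int",
    "DATETIME": "DateTime",
    "NVARCHAR": "string",
}

# C# value types need a '?' suffix when the column is nullable;
# reference types like string can already hold null.
VALUE_TYPES = {"Guid", "decimal", "bool", "int", "DateTime"}

def csharp_type(sql_type: str, nullable: bool) -> str:
    cs = SQL_TO_CSHARP.get(sql_type.upper(), "string")
    if nullable and cs in VALUE_TYPES:
        cs += "?"
    return cs

print(csharp_type("MONEY", False))            # decimal
print(csharp_type("UNIQUEIDENTIFIER", True))  # Guid?
```

The whole trap lives in that `if nullable` branch: forget it for one column and the code still compiles, then throws the first time the database hands back NULL.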
The math doesn't lie
I inherited a project last year with 30+ tables and no model classes. The previous developer used raw ADO.NET with string column names everywhere. Step one: create POCO classes for every table. I opened SSMS on one monitor, Visual Studio on the other, and started typing.
Three tables in, I realized I was doing pure mechanical translation. The SQL file already contains everything — column names, types, sizes, nullability. There's zero creative decision involved. I was acting as a human compiler, and a slow one at that.
One file in, one class out
SqlToPocoGenerator.exe Customer.sql
That's it. The tool reads the CREATE TABLE statement, maps every column to the correct C# type, handles nullability, and writes a clean class file. Optionally pass a namespace: SqlToPocoGenerator.exe Customer.sql MyApp.Data.
It handles all common SQL Server types. Unknown types default to string as a safe fallback — you'll see it in the output and know to fix it, which is better than a wrong type silently compiling.
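Conceptually, the pipeline is just a pass over the column definitions feeding that lookup table. A hedged sketch in Python, assuming a simple bracketed CREATE TABLE layout (the real tool handles far more syntax; the regex and names here are invented for illustration):

```python
import re

# Abbreviated, illustrative type map -- unknown types fall back to string.
SQL_TO_CSHARP = {
    "UNIQUEIDENTIFIER": "Guid", "MONEY": "decimal", "DECIMAL": "decimal",
    "BIT": "bool", "INT": "int", "DATETIME": "DateTime", "NVARCHAR": "string",
}
VALUE_TYPES = {"Guid", "decimal", "bool", "int", "DateTime"}

# Matches column lines like: [Name] NVARCHAR(100) NOT NULL,
COLUMN = re.compile(r"\[(\w+)\]\s+(\w+)(?:\([\d,\s]+\))?\s*(NOT NULL|NULL)?")

def generate_poco(create_table_sql: str, namespace: str = "Generated") -> str:
    table = re.search(r"CREATE TABLE\s+\[?(\w+)\]?", create_table_sql, re.I).group(1)
    props = []
    for name, sql_type, null_marker in COLUMN.findall(create_table_sql):
        cs = SQL_TO_CSHARP.get(sql_type.upper(), "string")  # safe fallback
        # SQL Server columns default to nullable when no marker is given.
        if null_marker != "NOT NULL" and cs in VALUE_TYPES:
            cs += "?"
        props.append(f"        public {cs} {name} {{ get; set; }}")
    body = "\n".join(props)
    return (f"namespace {namespace}\n{{\n    public class {table}\n    {{\n"
            f"{body}\n    }}\n}}")
```

Feed it a `CREATE TABLE [Customer] (...)` script and it emits a plain C# class, one auto-property per column. The point of the sketch is how little judgment is involved: the entire "translation" is one dictionary lookup and one nullability check, which is exactly why a human doing it by hand is a waste of an afternoon.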
The AI question
Yes, you can paste a CREATE TABLE statement into ChatGPT and get a POCO. I've done it myself. For a single table, that works fine.
For 30 tables, you're copying and pasting 30 times into a chat window, reformatting each response, and hoping the AI doesn't randomly decide to add data annotations or Entity Framework attributes you didn't ask for. A CLI that does the same thing the same way every time is faster for batch work.
Also: not everyone can paste their production database schema into an AI chat. Some companies have policies about that. Some developers just prefer tools that run locally and don't send data anywhere.
Second product, same lesson
This shipped on the same day as Product #3. By this point, I had a rhythm: identify the pattern, lock the scope to v0.1 (SQL Server only, one table per run, no relationships), build it, ship it.
The biggest decision was decoupling from any specific ORM. Product #1 was NHibernate-specific. This one generates plain POCOs — works with Dapper, Entity Framework, NHibernate, raw ADO.NET, whatever you use. Wider audience, same effort.
Two products in two weeks. The momentum was real.
Available on Gumroad for EUR 10.