Working with SQL
Managing Table Schemas
To create tables and update them as they change over time, I use the golang-migrate package.
Tables, indexes, and any other changes you want to make are written as .sql
files, referred to as "migrations", and typically live within the package they're
most relevant to. For example, if we were making a note-taking application,
I'd put these migrations in the directory
internal/
A migration file might look something like this:
internal/

CREATE TABLE IF NOT EXISTS notes (
    id           TEXT PRIMARY KEY,
    user_id      TEXT NOT NULL,
    title        TEXT NOT NULL,
    text         TEXT NOT NULL,
    date_created TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    date_updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX IF NOT EXISTS idx_notes_user_id ON notes(user_id);
The next time changes need to be made, such as adding a new column, you'd put
that in a new file
internal/
This is a time-tested way of managing database schemas, and it works quite well for both:
- updating a production environment
- bootstrapping a fresh DB to run tests against
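As a sketch of what such a follow-up migration might contain (the filename and column here are hypothetical examples of my own, not from the article's app; golang-migrate applies files in order of their version prefix):

```sql
-- Hypothetical second migration, e.g. 0002_add_notes_archived.up.sql.
-- Adds an "archived" flag to the existing notes table.
ALTER TABLE notes
    ADD COLUMN IF NOT EXISTS archived BOOLEAN NOT NULL DEFAULT FALSE;
```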
To run the migrations, I'll write (copy) the following migration function:
internal/

package notes

import (
	"context"
	"database/sql"
	"embed"
	"errors"
	"fmt"

	"github.com/golang-migrate/migrate/v4"
	"github.com/golang-migrate/migrate/v4/database/postgres"
	"github.com/golang-migrate/migrate/v4/source/iofs"
)

var (
	//go:embed migrations/*.sql
	migrationFiles embed.FS
)

func RunMigrations(ctx context.Context, db *sql.DB) error {
	sourceDriver, err := iofs.New(migrationFiles, "migrations")
	if err != nil {
		return fmt.Errorf("error creating sourceDriver for notes DB migrations: %w", err)
	}

	dbDriver, err := postgres.WithInstance(db, &postgres.Config{
		MigrationsTable: "schema_migrations_notes",
	})
	if err != nil {
		return fmt.Errorf("error creating dbDriver for notes DB migrations: %w", err)
	}

	migrator, err := migrate.NewWithInstance("iofs", sourceDriver, "postgres", dbDriver)
	if err != nil {
		return fmt.Errorf("error initializing notes DB migrations: %w", err)
	}

	err = migrator.Up()
	if err != nil && !errors.Is(err, migrate.ErrNoChange) {
		return err
	}

	return nil
}
You can invoke this function wherever you'd like. For me, that's almost always in the main.go file of the application I consider the "owner" of this DB.
Using SQL with Go
I don't have a one-size-fits-all solution here. In fact, counting the standard library, I consistently use three different packages, depending on the situation.
One thing that I almost always do is hide DB reads and writes behind a struct that can implement an interface. In the examples that follow, we'll be hanging functions off of the following struct:
type Store struct {
	db *sql.DB
}

func NewStore(db *sql.DB) *Store {
	return &Store{
		db: db,
	}
}
Standard Library
For simple stuff, the standard library works just fine.
func (s *Store) GetNote(ctx context.Context, id string) (Note, error) {
	query := `SELECT id, user_id, title, text, date_created, date_updated
		FROM notes WHERE id = $1`

	var note Note
	err := s.db.QueryRowContext(ctx, query, id).Scan(
		&note.Id,
		&note.UserId,
		&note.Title,
		&note.Text,
		&note.DateCreated,
		&note.DateUpdated,
	)
	return note, err
}

func (s *Store) DeleteNote(ctx context.Context, id string) error {
	query := `DELETE FROM notes WHERE id = $1`
	_, err := s.db.ExecContext(ctx, query, id)
	return err
}
One of the biggest turn-offs for developers using Go for the first time is the "order matters" nature of SQL argument binding and row scanning (as seen in the select example above). It's actually not that bad once you get used to it, but you'd be correct in thinking that it can get messy as queries and data types become larger and more complex. This brings us to the next options...
sqlx
When you potentially have a lot of arguments to bind, highlighted here by a "bulk upsert" function, the sqlx package can give you a nice declarative way of working.
type Note struct {
	Id          string    `db:"id"`
	UserId      string    `db:"user_id"`
	Title       string    `db:"title"`
	Text        string    `db:"text"`
	DateCreated time.Time `db:"date_created"`
	DateUpdated time.Time `db:"date_updated"`
}
func (s *Store) UpsertNotes(ctx context.Context, notes []Note) error {
	if len(notes) == 0 {
		return nil
	}

	db := sqlx.NewDb(s.db, "postgres")

	query := `
		INSERT INTO notes (id, user_id, title, text, date_created, date_updated)
		VALUES (:id, :user_id, :title, :text, :date_created, :date_updated)
		ON CONFLICT (id)
		DO UPDATE SET
			title = EXCLUDED.title,
			text = EXCLUDED.text,
			date_updated = CURRENT_TIMESTAMP
	`

	_, err := db.NamedExecContext(ctx, query, notes)
	return err
}
squirrel
For cases where you need to build up a dynamic where clause, I've had a really good time with the squirrel package.
type ListNotesFilters struct {
	Ids     []string
	UserIds []string
}
func (s *Store) ListNotes(ctx context.Context, filters ListNotesFilters) ([]Note, error) {
	db := sqlx.NewDb(s.db, "postgres")

	builder := sq.Select("*").From("notes")
	if len(filters.Ids) > 0 {
		builder = builder.Where(sq.Eq{"id": filters.Ids})
	}
	if len(filters.UserIds) > 0 {
		builder = builder.Where(sq.Eq{"user_id": filters.UserIds})
	}

	query, args, err := builder.PlaceholderFormat(sq.Dollar).ToSql()
	if err != nil {
		return nil, err
	}

	var notes []Note
	err = db.SelectContext(ctx, &notes, query, args...)
	return notes, err
}
This is a simple example, but squirrel also works really well for more complex use cases, like dynamically adding joins and subqueries.
A very cool thing that I'll call out here is that squirrel handles the query building and then returns basic types for the SQL query and its arguments. This means it pairs very nicely with the standard library or sqlx.
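To make that "basic types" handoff concrete, here's a stdlib-only sketch of the kind of (query, args) pair ToSql produces for the filters above. The buildListNotesQuery helper is my own illustration, and the exact SQL squirrel emits may differ; the point is that a string plus a slice of arguments feeds straight into QueryContext or sqlx's SelectContext.

```go
package main

import (
	"fmt"
	"strings"
)

// buildListNotesQuery hand-rolls what a builder hands back: a query string
// with $n placeholders and a matching slice of arguments.
func buildListNotesQuery(ids, userIds []string) (string, []any) {
	query := "SELECT * FROM notes"
	var (
		clauses []string
		args    []any
	)

	// addIn appends a "column IN (...)" clause, numbering placeholders
	// to line up with the growing args slice.
	addIn := func(column string, values []string) {
		if len(values) == 0 {
			return
		}
		placeholders := make([]string, len(values))
		for i, v := range values {
			args = append(args, v)
			placeholders[i] = fmt.Sprintf("$%d", len(args))
		}
		clauses = append(clauses, fmt.Sprintf("%s IN (%s)", column, strings.Join(placeholders, ",")))
	}

	addIn("id", ids)
	addIn("user_id", userIds)

	if len(clauses) > 0 {
		query += " WHERE " + strings.Join(clauses, " AND ")
	}
	return query, args
}

func main() {
	query, args := buildListNotesQuery([]string{"n1", "n2"}, []string{"u1"})
	fmt.Println(query) // SELECT * FROM notes WHERE id IN ($1,$2) AND user_id IN ($3)
	fmt.Println(args)  // [n1 n2 u1]
}
```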