I am writing a sort of REST-like API in Go. The database I have is legacy, so I don't control the field names, the structure, or for that matter anything else. I am abstracting database access methods into a separate package called datastore. The code I have looks like this (skipping all the error handling, etc.):
type Datastore struct{}

type Object struct {
    id   uint
    name string
    // ... zillion more fields here
}

func (Datastore) ObjectList() []Object {
    var objects []Object
    db, _ := sqlx.Open("postgres", conn_info)
    rows, _ := db.Queryx("SELECT * FROM object_table")
    defer rows.Close()
    for rows.Next() {
        var obj Object
        rows.Scan(&obj.id, &obj.name)
        objects = append(objects, obj)
    }
    return objects
}
The problem I am currently having is that the object table has dozens and dozens of fields. Some I care about, but some I do not. Some are named the same as Object's fields and some are not. Eventually I will need to support most of them, but I am working on a proof of concept first. It seems that Scan fails if it finds more columns in the row than there are arguments to Scan(). I could list the fields explicitly in the query, SELECT id, name FROM object_table, but
1. it makes the code extremely ugly (gofmt doesn't reformat SQL inside string literals, so long column lists look terrible)
2. it adds another place I need to edit when I want to support another field
Is there any way to implement a custom scanner interface that would take the rows object, load some data out of it into a struct and ignore the rest?
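To make concrete what I mean by "load some data and ignore the rest", here is a toy sketch with no real database: scanSubset is a hypothetical helper (not an existing sqlx API) that takes the column names and raw values for one row, fills only the struct fields whose db tags match, and silently skips everything else. The fields are exported here because reflection can only set exported fields; a real version would take the rows object instead of pre-extracted slices.

```go
package main

import (
	"fmt"
	"reflect"
)

// Object mirrors just the subset of the legacy table I care about;
// the db tags carry the legacy column names (hypothetical mapping).
type Object struct {
	ID   uint   `db:"id"`
	Name string `db:"name"`
}

// scanSubset copies values into dest's tagged fields by column name,
// ignoring any column with no matching field. dest must be a pointer
// to a struct. This is the shape of the partial scanner I have in mind.
func scanSubset(columns []string, values []interface{}, dest interface{}) {
	v := reflect.ValueOf(dest).Elem()
	t := v.Type()
	// Map db tag -> field index (in real code this could be cached per type).
	byTag := map[string]int{}
	for i := 0; i < t.NumField(); i++ {
		if tag := t.Field(i).Tag.Get("db"); tag != "" {
			byTag[tag] = i
		}
	}
	for i, col := range columns {
		idx, ok := byTag[col]
		if !ok {
			continue // column not represented in the struct: skip it
		}
		val := reflect.ValueOf(values[i])
		v.Field(idx).Set(val.Convert(t.Field(idx).Type))
	}
}

func main() {
	// Pretend these came from rows.Columns() and a generic rows.Scan
	// on SELECT * against the legacy table.
	cols := []string{"id", "name", "legacy_blob", "unused_flag"}
	vals := []interface{}{uint(7), "widget", "xxxx", true}

	var obj Object
	scanSubset(cols, vals, &obj)
	fmt.Println(obj.ID, obj.Name) // only id and name were picked up
}
```

If something like this already exists in sqlx or database/sql, that would obviously be preferable to rolling my own reflection code.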