Add SQL object expansion serialization support

This commit is contained in:
Sergey Chernov 2026-04-25 16:09:30 +03:00
parent 50e34e520e
commit 92e9325f40
6 changed files with 1128 additions and 10 deletions


@@ -200,6 +200,27 @@ assertThrows(RollbackException) {
- `execute(clause, params...)` — execute a side-effect statement and return `ExecutionResult`.
- `transaction(block)` — nested transaction with real savepoint semantics.
`select(...)` and `execute(...)` also support SQL object-expansion macros for declaration-driven writes:
- `@cols(?1)` — expand object argument `?1` to a comma-separated column list
- `@vals(?1)` — expand object argument `?1` to matching placeholders and bind values
- `@set(?1)` — expand object argument `?1` to `column = ?` pairs and bind values
Each macro also supports an optional clause-local exclusion list:
```lyng
tx.execute("update item set @set(?1 except: \"id\", \"createdAt\") where id = ?2", item, item.id)
```
Example:
```lyng
tx.execute("insert into item(@cols(?1)) values(@vals(?1))", item)
tx.execute("update item set @set(?1) where id = ?2", item, item.id)
```
When a clause uses any of these macros, non-expanded scalar parameters in the same SQL string must use explicit indexed placeholders such as `?2`, `?3`, and so on.
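To make the text rewriting concrete, here is a minimal Kotlin sketch of what the macros do to the SQL string, assuming the field list is already known. The function name `expandMacros` and the regex-based approach are illustrative only; the real runtime derives fields from the object's declaration, binds values, and uses a character-level parser that also respects string literals and comments.

```kotlin
// Illustrative sketch only: expand @cols/@vals/@set against a known field list.
// This toy version only rewrites the SQL text; it does not bind values.
fun expandMacros(clause: String, fields: List<String>): String {
    val macro = Regex("""@(cols|vals|set)\(\?\d+\)""")
    return macro.replace(clause) { m ->
        when (m.groupValues[1]) {
            "cols" -> fields.joinToString(", ")                // column list
            "vals" -> fields.joinToString(", ") { "?" }        // placeholders
            else   -> fields.joinToString(", ") { "$it = ?" }  // set pairs
        }
    }
}

fun main() {
    val fields = listOf("id", "title", "note")
    // insert into item(id, title, note) values(?, ?, ?)
    println(expandMacros("insert into item(@cols(?1)) values(@vals(?1))", fields))
    // update item set id = ?, title = ?, note = ? where id = ?2
    println(expandMacros("update item set @set(?1) where id = ?2", fields))
}
```

Note that the non-expanded `?2` placeholder is left untouched here, matching the rule that scalar parameters in macro clauses stay explicitly indexed.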
##### `ResultSet`
- `columns` — positional `SqlColumn` metadata, available before iteration.
@@ -219,13 +240,15 @@ Name-based access fails with `SqlUsageException` if the name is missing or ambiguous.
##### `DbFieldAdapter`
Custom DB field projection hook used by `@DbDecodeWith(...)` and `@DbSerializeWith(...)`.
- `decode(rawValue, column, row, targetType)` — adapt one raw DB field value to a Lyng value for the requested target type.
- `encode(value, targetType)` — adapt one Lyng value to a direct DB-bindable value for SQL object expansion.
Use `@DbDecodeWith(adapter)` on class constructor parameters and class-body fields/properties that participate in `decodeAs<T>()`.
Use `@DbSerializeWith(adapter)` on constructor parameters and class-body fields/properties that participate in `@cols(...)`, `@vals(...)`, and `@set(...)` object expansion.
Annotation arguments are evaluated once when the declaration is created, and the resulting adapter instance is retained in declaration metadata.
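The decode/encode contract can be mirrored in a small Kotlin sketch. The `FieldAdapter` interface below is an illustrative analogue of the Lyng `DbFieldAdapter`, with types simplified to `Any?`/`String`; it is not the runtime's actual Kotlin API.

```kotlin
// Illustrative analogue of the DbFieldAdapter contract; the runtime works
// with Obj values and TypeDecl target types rather than Any?/String.
interface FieldAdapter {
    // adapt one raw DB field value to a value of the requested target type
    fun decode(rawValue: Any?, targetType: String): Any?
    // adapt one value to a direct DB-bindable value for object expansion
    fun encode(value: Any?, targetType: String): Any?
}

// Example: trim strings on write, pass raw values through on read.
object TrimAdapter : FieldAdapter {
    override fun decode(rawValue: Any?, targetType: String): Any? = rawValue
    override fun encode(value: Any?, targetType: String): Any? =
        value?.toString()?.trim()
}

fun main() {
    println(TrimAdapter.encode("  Alice  ", "String")) // Alice
    println(TrimAdapter.decode("Alice", "String"))     // Alice
}
```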
##### `ExecutionResult`
@@ -250,6 +273,88 @@ Portable bind values:
Unsupported parameter values fail with `SqlUsageException`.
SQL object-expansion write rules:
- constructor parameters participate in projection by declaration order
- matching serializable class-body fields/properties also participate
- `@Transient` fields are excluded automatically
- `@DbExcept` fields are excluded automatically
- `except:` excludes additional fields for one specific macro use
- direct DB-bindable values are written as-is
- `@DbJson` fields are encoded as canonical JSON text
- `@DbLynon` fields are encoded as Lynon binary
- `@DbSerializeWith(adapter)` fields are encoded through the adapter
- unannotated non-bindable object fields fail with `SqlUsageException`
Write-side encoding is intentionally explicit. The runtime does not try to infer target DB column types from SQL text or backend metadata during statement preparation.
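The projection and encoding rules above can be condensed into a Kotlin sketch. The `Field` model, `Encoding` enum, and the stand-in JSON/Lynon encodings are simplified placeholders for the runtime's declaration metadata and codecs, not its real types.

```kotlin
// Illustrative sketch of write-side projection and encoding dispatch.
enum class Encoding { Direct, Json, Lynon }

data class Field(
    val name: String,
    val value: Any?,
    val transient: Boolean = false,   // corresponds to @Transient
    val excluded: Boolean = false,    // corresponds to @DbExcept
    val encoding: Encoding = Encoding.Direct
)

fun isDirectlyBindable(v: Any?): Boolean =
    v == null || v is Boolean || v is Int || v is Long || v is Double ||
    v is String || v is ByteArray

// declaration-level exclusions plus a clause-local `except:` set
fun project(fields: List<Field>, except: Set<String> = emptySet()): List<Field> =
    fields.filter { !it.transient && !it.excluded && it.name.lowercase() !in except }

fun encode(f: Field): Any? = when (f.encoding) {
    Encoding.Direct ->
        if (isDirectlyBindable(f.value)) f.value
        else error("field '${f.name}' requires an explicit DB serialization policy")
    Encoding.Json -> "json:" + f.value.toString()   // stand-in for canonical JSON
    Encoding.Lynon -> "lynon:" + f.value.toString() // stand-in for Lynon binary
}

fun main() {
    val fields = listOf(
        Field("id", 1L),
        Field("title", "first"),
        Field("cache", "skip", excluded = true)
    )
    println(project(fields).map { it.name })                       // [id, title]
    println(project(fields, except = setOf("id")).map { it.name }) // [title]
}
```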
Example:
```lyng
import lyng.io.db
import lyng.io.db.sqlite
class Payload(name: String, count: Int)
object TrimAdapter: DbFieldAdapter {
override fun encode(value, targetType) =
when(value) {
null -> null
else -> value.toString().trim()
}
}
class Item(
id: Int,
@DbSerializeWith(TrimAdapter) title: String,
@DbJson meta: Payload,
@DbLynon state: Payload
) {
var note: String = ""
@DbExcept var cache: String = ""
}
val db = openSqlite(":memory:")
val restored = db.transaction { tx ->
tx.execute(
"create table item(id integer not null, title text not null, meta text not null, state blob not null, note text not null)"
)
val item = Item(1, " first ", Payload("json", 10), Payload("bin", 20))
item.note = "created"
item.cache = "not stored"
tx.execute("insert into item(@cols(?1)) values(@vals(?1))", item)
item.title = " second "
item.meta = Payload("json2", 11)
item.state = Payload("bin2", 21)
item.note = "updated"
tx.execute(
"update item set @set(?1 except: \"id\") where id = ?2",
item,
item.id
)
tx.select("select id, title, meta, state, note from item").decodeAs<Item>().first
}
assertEquals("second", restored.title)
assertEquals("json2", restored.meta.name)
assertEquals(21, restored.state.count)
assertEquals("updated", restored.note)
```
This example shows:
- `@DbSerializeWith(...)` trimming a string before write
- `@DbJson` storing structured data in a text column
- `@DbLynon` storing structured data in a binary column
- `@DbExcept` excluding a field from automatic projection
- `@set(... except: "id")` skipping one field for an update clause
- `decodeAs<Item>()` reconstructing the object on read
Portable result metadata categories:
- `Binary`


@@ -36,6 +36,9 @@ import net.sergeych.lyng.obj.ObjInstance
import net.sergeych.lyng.obj.ObjInt
import net.sergeych.lyng.obj.ObjNull
import net.sergeych.lyng.obj.ObjRecord
import net.sergeych.lyng.obj.ObjDateTime
import net.sergeych.lyng.obj.ObjInstant
import net.sergeych.lyng.obj.ObjReal
import net.sergeych.lyng.obj.ObjString
import net.sergeych.lyng.obj.ObjTypeExpr
import net.sergeych.lyng.obj.ObjVoid
@@ -227,16 +230,16 @@ internal class SqlRuntimeTypes private constructor(
self.lifetime.ensureActive(this)
val clause = (args.list.getOrNull(0) as? ObjString)?.value
?: raiseClassCastError("query must be String")
val prepared = prepareSqlClause(requireScope(), self.types, clause, args.list.drop(1))
SqlResultSetObj(self.types, self.lifetime, self.backend.select(this, prepared.clause, prepared.params))
}
transactionClass.addFn("execute") {
val self = thisAs<SqlTransactionObj>()
self.lifetime.ensureActive(this)
val clause = (args.list.getOrNull(0) as? ObjString)?.value
?: raiseClassCastError("query must be String")
val prepared = prepareSqlClause(requireScope(), self.types, clause, args.list.drop(1))
SqlExecutionResultObj(self.types, self.lifetime, self.backend.execute(this, prepared.clause, prepared.params))
}
transactionClass.addFn("transaction") {
val self = thisAs<SqlTransactionObj>()
@@ -659,6 +662,10 @@ private suspend fun decodeSqlValue(
}
return adapted
}
val explicitEncoding = findExplicitDbEncodingAnnotation(scope, types, annotations)
if (explicitEncoding != null) {
return decodeExplicitDbEncoding(scope, types, row, column, value, targetType, explicitEncoding)
}
if (value === ObjNull) {
if (targetType.isNullable || targetType == TypeDecl.TypeNullableAny) return ObjNull
raiseSqlUsage(scope, types, "SQL column '${column.name}' is null but target type ${renderTypeName(targetType)} is non-null")
@@ -715,6 +722,18 @@ private fun findDbDecodeWithAnnotation(
return matches.singleOrNull()
}
private fun findExplicitDbEncodingAnnotation(
scope: Scope,
types: SqlRuntimeTypes,
annotations: List<DeclAnnotation>,
): DeclAnnotation? {
val matches = annotations.filter { it.name == "DbJson" || it.name == "DbLynon" || it.name == "DbSerializeWith" }
if (matches.size > 1) {
raiseSqlUsage(scope, types, "Only one of @DbJson, @DbLynon, or @DbSerializeWith(...) is allowed per declaration")
}
return matches.singleOrNull()
}
private suspend fun applyDbFieldAdapter(
scope: Scope,
types: SqlRuntimeTypes,
@@ -723,13 +742,14 @@ private suspend fun applyDbFieldAdapter(
value: Obj,
targetType: TypeDecl,
annotation: DeclAnnotation,
annotationName: String = annotation.name,
): Obj {
if (annotation.named.isNotEmpty() || annotation.positional.size != 1) {
raiseSqlUsage(scope, types, "@$annotationName(...) expects exactly one adapter instance argument")
}
val adapter = annotation.positional.first()
if (!adapter.isInstanceOf(types.core.dbFieldAdapterClass)) {
raiseSqlUsage(scope, types, "@$annotationName(...) argument must implement DbFieldAdapter")
}
return try {
adapter.invokeInstanceMethod(
@@ -746,11 +766,83 @@ private suspend fun applyDbFieldAdapter(
raiseSqlUsage(
scope,
types,
"Failed to decode SQL column '${column.name}' with @$annotationName(...): ${e.message ?: e::class.simpleName}"
)
}
}
private suspend fun decodeExplicitDbEncoding(
scope: Scope,
types: SqlRuntimeTypes,
row: SqlRowObj,
column: SqlColumnMeta,
value: Obj,
targetType: TypeDecl,
annotation: DeclAnnotation,
): Obj {
when (annotation.name) {
"DbJson", "DbLynon" -> if (annotation.positional.isNotEmpty() || annotation.named.isNotEmpty()) {
raiseSqlUsage(scope, types, "@${annotation.name} does not take arguments")
}
}
if (value === ObjNull) {
if (targetType.isNullable || targetType == TypeDecl.TypeNullableAny) return ObjNull
raiseSqlUsage(scope, types, "SQL column '${column.name}' is null but target type ${renderTypeName(targetType)} is non-null")
}
return when (annotation.name) {
"DbJson" -> {
val text = value as? ObjString
?: raiseSqlUsage(scope, types, "@DbJson expects SQL column '${column.name}' to be String, got ${value.objClass.className}")
try {
ObjJsonClass.decodeFromJsonElement(scope, Json.parseToJsonElement(text.value), targetType)
} catch (e: Throwable) {
raiseSqlUsage(
scope,
types,
"Failed to decode JSON column '${column.name}' as ${renderTypeName(targetType)}: ${e.message ?: e::class.simpleName}"
)
}
}
"DbLynon" -> {
val payload = value as? ObjBuffer
?: raiseSqlUsage(scope, types, "@DbLynon expects SQL column '${column.name}' to be Binary, got ${value.objClass.className}")
try {
val decoded = ObjLynonClass.decodeAny(scope, ObjBitBuffer(BitArray(payload.byteArray, 8)))
if (!matchesTypeDeclCompat(scope, decoded, targetType)) {
raiseSqlUsage(
scope,
types,
"Lynon-decoded SQL column '${column.name}' does not match target type ${renderTypeName(targetType)}"
)
}
decoded
} catch (e: Throwable) {
raiseSqlUsage(
scope,
types,
"Failed to decode Lynon column '${column.name}' as ${renderTypeName(targetType)}: ${e.message ?: e::class.simpleName}"
)
}
}
"DbSerializeWith" -> {
val adapted = applyDbFieldAdapter(scope, types, row, column, value, targetType, annotation, "DbSerializeWith")
if (adapted === ObjNull) {
if (targetType.isNullable || targetType == TypeDecl.TypeNullableAny) return ObjNull
raiseSqlUsage(scope, types, "SQL column '${column.name}' is null but target type ${renderTypeName(targetType)} is non-null")
}
if (!matchesTypeDeclCompat(scope, adapted, targetType)) {
raiseSqlUsage(
scope,
types,
"DB adapter result for column '${column.name}' does not match target type ${renderTypeName(targetType)}"
)
}
adapted
}
else -> raiseSqlUsage(scope, types, "Unsupported DB field decoding annotation: @${annotation.name}")
}
}
private fun runtimeTargetTypeObject(scope: Scope, targetType: TypeDecl): Obj {
return resolveTypeDeclClass(scope, targetType) ?: ObjTypeExpr(targetType)
}
@@ -824,3 +916,500 @@ private fun raiseSqlUsage(scope: Scope, types: SqlRuntimeTypes?, message: String
}
scope.raiseIllegalArgument(message)
}
private data class PreparedSqlClause(
val clause: String,
val params: List<Obj>,
)
private data class SqlProjectionField(
val name: String,
val value: Obj,
val targetType: TypeDecl,
val annotations: List<DeclAnnotation>,
)
private data class SqlProjectionFilter(
val excludedFields: Set<String> = emptySet(),
)
private enum class SqlMacroKind {
Cols,
Vals,
Set
}
private suspend fun prepareSqlClause(
scope: Scope,
types: SqlRuntimeTypes,
clause: String,
params: List<Obj>,
): PreparedSqlClause {
if (!clause.contains("@")) return PreparedSqlClause(clause, params)
val explicitRefs = findExplicitSqlArgumentRefs(scope, types, clause)
val output = StringBuilder(clause.length + 32)
val boundParams = mutableListOf<Obj>()
var cursor = 0
var legacySequentialIndex = 0
var sawMacro = false
while (cursor < clause.length) {
val macro = parseSqlMacroAt(scope, types, clause, cursor)
if (macro != null) {
sawMacro = true
val projection = buildSqlProjection(scope, types, params, macro.paramIndex, macro.filter)
output.append(
when (macro.kind) {
SqlMacroKind.Cols -> projection.joinToString(", ") { it.name }
SqlMacroKind.Vals -> projection.joinToString(", ") { "?" }
SqlMacroKind.Set -> projection.joinToString(", ") { "${it.name} = ?" }
}
)
if (macro.kind != SqlMacroKind.Cols) {
for (field in projection) {
boundParams += encodeProjectedDbField(scope, types, field)
}
}
cursor = macro.endExclusive
continue
}
val indexed = parseIndexedPlaceholderAt(clause, cursor)
if (indexed != null) {
output.append('?')
boundParams += resolveSqlArgument(scope, types, params, indexed.paramIndex)
cursor = indexed.endExclusive
continue
}
val ch = clause[cursor]
if (ch == '\'') {
val end = skipSqlSingleQuotedString(clause, cursor)
output.append(clause, cursor, end)
cursor = end
continue
}
if (ch == '"') {
val end = skipSqlDoubleQuotedIdentifier(clause, cursor)
output.append(clause, cursor, end)
cursor = end
continue
}
if (ch == '-' && cursor + 1 < clause.length && clause[cursor + 1] == '-') {
val end = skipSqlLineComment(clause, cursor)
output.append(clause, cursor, end)
cursor = end
continue
}
if (ch == '/' && cursor + 1 < clause.length && clause[cursor + 1] == '*') {
val end = skipSqlBlockComment(clause, cursor)
output.append(clause, cursor, end)
cursor = end
continue
}
if (ch == '?') {
if (sawMacro) {
raiseSqlUsage(
scope,
types,
"SQL clauses using @cols/@vals/@set must use explicit indexed placeholders like ?1, ?2 for non-expanded parameters"
)
}
output.append('?')
if (legacySequentialIndex >= params.size) {
raiseSqlUsage(scope, types, "SQL parameter count mismatch: statement expects more values than provided")
}
boundParams += params[legacySequentialIndex++]
cursor++
continue
}
output.append(ch)
cursor++
}
if (!sawMacro) return PreparedSqlClause(clause, params)
val unreferencedIndexed = (1..params.size).filter { it !in explicitRefs }
if (unreferencedIndexed.isNotEmpty()) {
raiseSqlUsage(
scope,
types,
"Unused SQL argument(s) in macro clause: ${unreferencedIndexed.joinToString(", ") { "?$it" }}"
)
}
return PreparedSqlClause(output.toString(), boundParams)
}
private fun findExplicitSqlArgumentRefs(
scope: Scope,
types: SqlRuntimeTypes,
clause: String,
): Set<Int> {
val refs = linkedSetOf<Int>()
var cursor = 0
while (cursor < clause.length) {
val macro = parseSqlMacroAt(scope, types, clause, cursor)
if (macro != null) {
refs += macro.paramIndex
cursor = macro.endExclusive
continue
}
val indexed = parseIndexedPlaceholderAt(clause, cursor)
if (indexed != null) {
refs += indexed.paramIndex
cursor = indexed.endExclusive
continue
}
val ch = clause[cursor]
cursor = when {
ch == '\'' -> skipSqlSingleQuotedString(clause, cursor)
ch == '"' -> skipSqlDoubleQuotedIdentifier(clause, cursor)
ch == '-' && cursor + 1 < clause.length && clause[cursor + 1] == '-' -> skipSqlLineComment(clause, cursor)
ch == '/' && cursor + 1 < clause.length && clause[cursor + 1] == '*' -> skipSqlBlockComment(clause, cursor)
else -> cursor + 1
}
}
return refs
}
private data class ParsedSqlMacro(
val kind: SqlMacroKind,
val paramIndex: Int,
val filter: SqlProjectionFilter,
val endExclusive: Int,
)
private data class ParsedIndexedPlaceholder(
val paramIndex: Int,
val endExclusive: Int,
)
private fun parseSqlMacroAt(
scope: Scope,
types: SqlRuntimeTypes,
clause: String,
start: Int,
): ParsedSqlMacro? {
if (clause[start] != '@') return null
val kinds = listOf(
"cols" to SqlMacroKind.Cols,
"vals" to SqlMacroKind.Vals,
"set" to SqlMacroKind.Set,
)
for ((name, kind) in kinds) {
if (!clause.startsWith("@$name", start)) continue
var cursor = start + name.length + 1
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor >= clause.length || clause[cursor] != '(') {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name: expected '('")
}
cursor++
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor >= clause.length || clause[cursor] != '?') {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): expected indexed argument like ?1")
}
cursor++
val numberStart = cursor
while (cursor < clause.length && clause[cursor].isDigit()) cursor++
if (numberStart == cursor) {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): expected indexed argument like ?1")
}
val paramIndex = clause.substring(numberStart, cursor).toInt()
val excludedFields = linkedSetOf<String>()
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor < clause.length && clause.startsWith("except", cursor)) {
val afterKeyword = cursor + "except".length
if (afterKeyword < clause.length && isSqlMacroFilterIdentifierPart(clause[afterKeyword])) {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): expected ')' or 'except:'")
}
cursor = afterKeyword
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor >= clause.length || clause[cursor] != ':') {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): expected 'except:'")
}
cursor++
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
var parsedAny = false
while (cursor < clause.length) {
val parsedField = parseSqlMacroFilterFieldName(clause, cursor) ?: break
excludedFields += parsedField.name.lowercase()
cursor = parsedField.endExclusive
parsedAny = true
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor < clause.length && clause[cursor] == ',') {
cursor++
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
continue
}
break
}
if (!parsedAny) {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): 'except:' must list at least one field name")
}
}
while (cursor < clause.length && clause[cursor].isWhitespace()) cursor++
if (cursor >= clause.length || clause[cursor] != ')') {
raiseSqlUsage(scope, types, "Malformed SQL macro @$name(...): expected ')'")
}
return ParsedSqlMacro(kind, paramIndex, SqlProjectionFilter(excludedFields), cursor + 1)
}
return null
}
private fun parseIndexedPlaceholderAt(clause: String, start: Int): ParsedIndexedPlaceholder? {
if (clause[start] != '?') return null
if (start + 1 >= clause.length || !clause[start + 1].isDigit()) return null
var cursor = start + 1
while (cursor < clause.length && clause[cursor].isDigit()) cursor++
return ParsedIndexedPlaceholder(clause.substring(start + 1, cursor).toInt(), cursor)
}
private fun isSqlMacroFilterIdentifierStart(ch: Char): Boolean = ch == '_' || ch.isLetter()
private fun isSqlMacroFilterIdentifierPart(ch: Char): Boolean =
ch == '_' || ch.isLetterOrDigit()
private data class ParsedSqlMacroFilterField(
val name: String,
val endExclusive: Int,
)
private fun parseSqlMacroFilterFieldName(clause: String, start: Int): ParsedSqlMacroFilterField? {
if (start >= clause.length) return null
val quote = clause[start]
if (quote == '"' || quote == '\'') {
var cursor = start + 1
val out = StringBuilder()
while (cursor < clause.length) {
val ch = clause[cursor]
if (ch == quote) {
if (cursor + 1 < clause.length && clause[cursor + 1] == quote) {
out.append(quote)
cursor += 2
continue
}
return ParsedSqlMacroFilterField(out.toString(), cursor + 1)
}
out.append(ch)
cursor++
}
return null
}
if (!isSqlMacroFilterIdentifierStart(clause[start])) return null
var cursor = start + 1
while (cursor < clause.length && isSqlMacroFilterIdentifierPart(clause[cursor])) cursor++
return ParsedSqlMacroFilterField(clause.substring(start until cursor), cursor)
}
private fun skipSqlSingleQuotedString(clause: String, start: Int): Int {
var cursor = start + 1
while (cursor < clause.length) {
if (clause[cursor] == '\'') {
if (cursor + 1 < clause.length && clause[cursor + 1] == '\'') {
cursor += 2
continue
}
return cursor + 1
}
cursor++
}
return clause.length
}
private fun skipSqlDoubleQuotedIdentifier(clause: String, start: Int): Int {
var cursor = start + 1
while (cursor < clause.length) {
if (clause[cursor] == '"') {
if (cursor + 1 < clause.length && clause[cursor + 1] == '"') {
cursor += 2
continue
}
return cursor + 1
}
cursor++
}
return clause.length
}
private fun skipSqlLineComment(clause: String, start: Int): Int {
var cursor = start + 2
while (cursor < clause.length && clause[cursor] != '\n') cursor++
return cursor
}
private fun skipSqlBlockComment(clause: String, start: Int): Int {
var cursor = start + 2
while (cursor + 1 < clause.length) {
if (clause[cursor] == '*' && clause[cursor + 1] == '/') return cursor + 2
cursor++
}
return clause.length
}
private fun resolveSqlArgument(
scope: Scope,
types: SqlRuntimeTypes,
params: List<Obj>,
oneBasedIndex: Int,
): Obj {
if (oneBasedIndex <= 0 || oneBasedIndex > params.size) {
raiseSqlUsage(scope, types, "SQL parameter reference ?$oneBasedIndex is out of range for ${params.size} argument(s)")
}
return params[oneBasedIndex - 1]
}
private suspend fun buildSqlProjection(
scope: Scope,
types: SqlRuntimeTypes,
params: List<Obj>,
oneBasedIndex: Int,
filter: SqlProjectionFilter = SqlProjectionFilter(),
): List<SqlProjectionField> {
val source = resolveSqlArgument(scope, types, params, oneBasedIndex)
val instance = source as? ObjInstance
?: raiseSqlUsage(scope, types, "SQL object expansion expects argument ?$oneBasedIndex to be an object instance, got ${source.objClass.className}")
val projected = mutableListOf<SqlProjectionField>()
val seen = linkedSetOf<String>()
val meta = instance.objClass.constructorMeta
if (meta != null) {
for (param in meta.params) {
if (param.isTransient || hasDbExceptAnnotation(param.annotations)) continue
if (!seen.add(param.name.lowercase())) {
raiseSqlUsage(scope, types, "Ambiguous SQL projection field '${param.name}' in ${instance.objClass.className}")
}
projected += SqlProjectionField(
name = param.name,
value = instance.readField(scope, param.name).value,
targetType = param.type,
annotations = param.annotations
)
}
}
for ((storageName, record) in instance.serializingVars) {
val name = storageName.substringAfterLast("::")
val annotations = instance.objClass.getInstanceMemberOrNull(name)?.annotations ?: emptyList()
if (hasDbExceptAnnotation(annotations)) continue
if (!seen.add(name.lowercase())) {
raiseSqlUsage(scope, types, "Ambiguous SQL projection field '$name' in ${instance.objClass.className}")
}
projected += SqlProjectionField(
name = name,
value = record.value,
targetType = record.typeDecl ?: TypeDecl.TypeAny,
annotations = annotations
)
}
if (projected.isEmpty()) {
raiseSqlUsage(scope, types, "SQL object expansion for ${instance.objClass.className} produced no projected fields")
}
if (filter.excludedFields.isEmpty()) {
return projected
}
val unknownFields = filter.excludedFields.filter { excluded ->
projected.none { it.name.lowercase() == excluded }
}
if (unknownFields.isNotEmpty()) {
raiseSqlUsage(
scope,
types,
"SQL object expansion for ${instance.objClass.className} can't exclude unknown field(s): ${unknownFields.joinToString(", ")}"
)
}
val filtered = projected.filter { it.name.lowercase() !in filter.excludedFields }
if (filtered.isEmpty()) {
raiseSqlUsage(scope, types, "SQL object expansion for ${instance.objClass.className} produced no projected fields after except:")
}
return filtered
}
private fun hasDbExceptAnnotation(annotations: List<DeclAnnotation>): Boolean =
annotations.any { it.name == "DbExcept" }
private suspend fun encodeProjectedDbField(
scope: Scope,
types: SqlRuntimeTypes,
field: SqlProjectionField,
): Obj {
if (field.value === ObjNull) return ObjNull
val explicit = findExplicitDbEncodingAnnotation(scope, types, field.annotations)
if (explicit != null) {
return encodeExplicitDbField(scope, types, field, explicit)
}
if (isDirectDbBindable(field.value)) return field.value
raiseSqlUsage(
scope,
types,
"Field '${field.name}' of ${field.value.objClass.className} requires explicit DB serialization policy (@DbJson, @DbLynon, or @DbSerializeWith(...))"
)
}
private suspend fun encodeExplicitDbField(
scope: Scope,
types: SqlRuntimeTypes,
field: SqlProjectionField,
annotation: DeclAnnotation,
): Obj {
return when (annotation.name) {
"DbJson" -> {
if (annotation.positional.isNotEmpty() || annotation.named.isNotEmpty()) {
raiseSqlUsage(scope, types, "@DbJson does not take arguments")
}
ObjString(ObjJsonClass.encodeToJsonElement(scope, field.value, field.targetType).toString())
}
"DbLynon" -> {
if (annotation.positional.isNotEmpty() || annotation.named.isNotEmpty()) {
raiseSqlUsage(scope, types, "@DbLynon does not take arguments")
}
ObjBuffer(ObjLynonClass.encodeAny(scope, field.value).bitArray.asUByteArray())
}
"DbSerializeWith" -> encodeWithDbFieldAdapter(scope, types, field, annotation)
else -> raiseSqlUsage(scope, types, "Unsupported DB field encoding annotation: @${annotation.name}")
}
}
private suspend fun encodeWithDbFieldAdapter(
scope: Scope,
types: SqlRuntimeTypes,
field: SqlProjectionField,
annotation: DeclAnnotation,
): Obj {
if (annotation.named.isNotEmpty() || annotation.positional.size != 1) {
raiseSqlUsage(scope, types, "@DbSerializeWith(...) expects exactly one adapter instance argument")
}
val adapter = annotation.positional.first()
if (!adapter.isInstanceOf(types.core.dbFieldAdapterClass)) {
raiseSqlUsage(scope, types, "@DbSerializeWith(...) argument must implement DbFieldAdapter")
}
val encoded = try {
adapter.invokeInstanceMethod(
scope,
"encode",
Arguments(field.value, runtimeTargetTypeObject(scope, field.targetType))
)
} catch (e: Throwable) {
raiseSqlUsage(
scope,
types,
"Failed to encode SQL field '${field.name}' with @DbSerializeWith(...): ${e.message ?: e::class.simpleName}"
)
}
if (!isDirectDbBindable(encoded)) {
raiseSqlUsage(
scope,
types,
"@DbSerializeWith(...) for field '${field.name}' must return a direct DB-bindable value, got ${encoded.objClass.className}"
)
}
return encoded
}
private fun isDirectDbBindable(value: Obj): Boolean = when (value) {
ObjNull -> true
is ObjBool, is ObjInt, is ObjReal, is ObjString, is ObjBuffer, is ObjInstant, is ObjDateTime -> true
else -> when (value.objClass.className) {
"Date", "Decimal" -> true
else -> false
}
}


@@ -361,6 +361,168 @@ class LyngSqliteModuleTest {
assertEquals("SqlUsageException", error.errorObject.objClass.className)
}
@Test
fun testSqlObjectExpansionInsertAndUpdateUseDbAnnotations() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Payload(name: String, count: Int)
class Item(
id: Int,
title: String,
@DbJson meta: Payload,
@DbLynon state: Payload
) {
var note: String = ""
@DbExcept var cache: String = "skip"
@Transient var transientNote: String = "temp"
}
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table item(id integer not null, title text not null, meta text not null, state blob not null, note text not null)")
val item = Item(1, "first", Payload("json", 10), Payload("bin", 20))
item.note = "created"
tx.execute("insert into item(@cols(?1)) values(@vals(?1))", item)
item.title = "second"
item.meta = Payload("json2", 11)
item.state = Payload("bin2", 21)
item.note = "updated"
item.cache = "must-not-be-written"
item.transientNote = "must-not-be-written"
tx.execute("update item set @set(?1) where id = ?2", item, 1)
val restored = tx.select("select id, title, meta, state, note from item").decodeAs<Item>().first
assertEquals(1, restored.id)
assertEquals("second", restored.title)
assertEquals("json2", restored.meta.name)
assertEquals(11, restored.meta.count)
assertEquals("bin2", restored.state.name)
assertEquals(21, restored.state.count)
assertEquals("updated", restored.note)
restored.state.count
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-object-expansion>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(21L, result.value)
}
@Test
fun testSqlObjectExpansionSupportsDbSerializeWithAdapter() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db
import lyng.io.db.sqlite
object TrimAdapter: DbFieldAdapter {
override fun encode(value, targetType) =
when(value) {
null -> null
else -> value.toString().trim()
}
}
class User(
id: Int,
@DbSerializeWith(TrimAdapter) name: String
)
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table user(id integer not null, name text not null)")
tx.execute("insert into user(@cols(?1)) values(@vals(?1))", User(7, " Alice "))
val row = tx.select("select id, name from user").first
val storedName: String = row["name"] as String
assertEquals(7, row["id"])
assertEquals("Alice", storedName)
storedName.size
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-object-expansion-adapter>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(5L, result.value)
}
@Test
fun testSqlObjectExpansionSupportsClauseLevelExceptFilter() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Item(id: Int, title: String, note: String) {
var stamp: String = ""
@DbExcept var cache: String = "skip"
}
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table item(id integer not null, title text not null, note text not null, stamp text not null default '')")
val item = Item(1, "first", "keep")
item.stamp = "created"
tx.execute(
"insert into item(@cols(?1 except: \"stamp\")) values(@vals(?1 except: \"stamp\"))",
item
)
item.title = "second"
item.note = "changed-but-excluded"
item.stamp = "updated"
item.cache = "still-skip"
tx.execute(
"update item set @set(?1 except: \"id\", \"note\") where id = ?2",
item,
1
)
val restored = tx.select("select id, title, note, stamp from item").decodeAs<Item>().first
assertEquals(1, restored.id)
assertEquals("second", restored.title)
assertEquals("keep", restored.note)
assertEquals("updated", restored.stamp)
restored.stamp.size
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-object-expansion-except>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(7L, result.value)
}
@Test
fun testSqlObjectExpansionRejectsUnannotatedComplexFields() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Payload(name: String)
class BadRecord(id: Int, payload: Payload)
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table bad_record(id integer not null, payload text not null)")
tx.execute("insert into bad_record(@cols(?1)) values(@vals(?1))", BadRecord(1, Payload("x")))
}
""".trimIndent()
val error = assertFailsWith<ExecutionError> {
Compiler.compile(Source("<sqlite-object-expansion-bad-field>", code), scope.importManager).execute(scope)
}
assertEquals("SqlUsageException", error.errorObject.objClass.className)
}
@Test
fun testNestedTransactionRollbackUsesSavepoint() = runTest {
val scope = Script.newScope()


@ -457,6 +457,168 @@ class LyngSqliteModuleNativeTest {
assertEquals("SqlUsageException", error.errorObject.objClass.className)
}
@Test
fun testSqlObjectExpansionInsertAndUpdateUseDbAnnotations() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Payload(name: String, count: Int)
class Item(
id: Int,
title: String,
@DbJson meta: Payload,
@DbLynon state: Payload
) {
var note: String = ""
@DbExcept var cache: String = "skip"
@Transient var transientNote: String = "temp"
}
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table item(id integer not null, title text not null, meta text not null, state blob not null, note text not null)")
val item = Item(1, "first", Payload("json", 10), Payload("bin", 20))
item.note = "created"
tx.execute("insert into item(@cols(?1)) values(@vals(?1))", item)
item.title = "second"
item.meta = Payload("json2", 11)
item.state = Payload("bin2", 21)
item.note = "updated"
item.cache = "must-not-be-written"
item.transientNote = "must-not-be-written"
tx.execute("update item set @set(?1) where id = ?2", item, 1)
val restored = tx.select("select id, title, meta, state, note from item").decodeAs<Item>().first
assertEquals(1, restored.id)
assertEquals("second", restored.title)
assertEquals("json2", restored.meta.name)
assertEquals(11, restored.meta.count)
assertEquals("bin2", restored.state.name)
assertEquals(21, restored.state.count)
assertEquals("updated", restored.note)
restored.state.count
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-native-object-expansion>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(21L, result.value)
}
@Test
fun testSqlObjectExpansionSupportsDbSerializeWithAdapter() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db
import lyng.io.db.sqlite
object TrimAdapter: DbFieldAdapter {
override fun encode(value, targetType) =
when(value) {
null -> null
else -> value.toString().trim()
}
}
class User(
id: Int,
@DbSerializeWith(TrimAdapter) name: String
)
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table user(id integer not null, name text not null)")
tx.execute("insert into user(@cols(?1)) values(@vals(?1))", User(7, " Alice "))
val row = tx.select("select id, name from user").first
val storedName: String = row["name"] as String
assertEquals(7, row["id"])
assertEquals("Alice", storedName)
storedName.size
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-native-object-expansion-adapter>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(5L, result.value)
}
@Test
fun testSqlObjectExpansionSupportsClauseLevelExceptFilter() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Item(id: Int, title: String, note: String) {
var stamp: String = ""
@DbExcept var cache: String = "skip"
}
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table item(id integer not null, title text not null, note text not null, stamp text not null default '')")
val item = Item(1, "first", "keep")
item.stamp = "created"
tx.execute(
"insert into item(@cols(?1 except: \"stamp\")) values(@vals(?1 except: \"stamp\"))",
item
)
item.title = "second"
item.note = "changed-but-excluded"
item.stamp = "updated"
item.cache = "still-skip"
tx.execute(
"update item set @set(?1 except: \"id\", \"note\") where id = ?2",
item,
1
)
val restored = tx.select("select id, title, note, stamp from item").decodeAs<Item>().first
assertEquals(1, restored.id)
assertEquals("second", restored.title)
assertEquals("keep", restored.note)
assertEquals("updated", restored.stamp)
restored.stamp.size
}
""".trimIndent()
val result = Compiler.compile(Source("<sqlite-native-object-expansion-except>", code), scope.importManager).execute(scope) as ObjInt
assertEquals(7L, result.value)
}
@Test
fun testSqlObjectExpansionRejectsUnannotatedComplexFields() = runTest {
val scope = Script.newScope()
createSqliteModule(scope.importManager)
val code = """
import lyng.io.db.sqlite
class Payload(name: String)
class BadRecord(id: Int, payload: Payload)
val db = openSqlite(":memory:")
db.transaction { tx ->
tx.execute("create table bad_record(id integer not null, payload text not null)")
tx.execute("insert into bad_record(@cols(?1)) values(@vals(?1))", BadRecord(1, Payload("x")))
}
""".trimIndent()
val error = assertFailsWith<ExecutionError> {
Compiler.compile(Source("<sqlite-native-object-expansion-bad-field>", code), scope.importManager).execute(scope)
}
assertEquals("SqlUsageException", error.errorObject.objClass.className)
}
@Test
fun testExecuteRejectsReturningButSelectSupportsIt() = runTest {
val scope = Script.newScope()


@ -24,7 +24,9 @@ extern class SqlColumn {
Adapter interface for custom DB field projection.
Use it with `@DbDecodeWith(adapter)` on class constructor parameters or
class-body fields/properties participating in `decodeAs<T>()` projection,
and with `@DbSerializeWith(adapter)` on fields participating in SQL object
expansion macros such as `@vals(?1)` and `@set(?1)`.
`targetType` is the requested Lyng target type represented as a runtime
type object, such as a class or a type expression.
@ -41,6 +43,10 @@ interface DbFieldAdapter {
/*
Encode one Lyng value into a database field representation.
The result must be a direct DB-bindable value:
null, Bool, Int, Double/Real, Decimal, String, Buffer, Date, DateTime,
or Instant.
*/
fun encode(value: Object?, targetType: Object): Object? =
throw NotImplementedException("DB field adapter encode is not implemented")
@ -185,6 +191,29 @@ extern class SqlTransaction {
- Date, DateTime, Instant
Unsupported parameter values should fail with `SqlUsageException`.
SQL object expansion macros are also supported:
- `@cols(?1)` expands one object argument to a comma-separated column list
- `@vals(?1)` expands the same object to matching `?` placeholders and
generated bind values
- `@set(?1)` expands to `col = ?` pairs and generated bind values
- each macro also accepts an optional `except:` filter, for example
`@set(?1 except: "id", "updatedAt")`
When a clause uses `@cols`, `@vals`, or `@set`, any non-expanded scalar
parameters in the same clause must use explicit indexed placeholders such
as `?2`, `?3`, and so on.
Projection is declaration-driven:
- `@Transient` fields are excluded
- `@DbExcept` fields are excluded
- `except:` excludes additional fields for that specific macro use
- `@DbJson` fields are encoded as JSON text
- `@DbLynon` fields are encoded as Lynon binary
- `@DbSerializeWith(adapter)` fields are encoded through the adapter
- unannotated non-primitive fields fail with `SqlUsageException`
*/
fun select(clause: String, params...): ResultSet


@ -278,3 +278,74 @@ Current recommended projection policy:
- then Lynon decode for binary columns when the target is not `Buffer`
- no implicit JSON decode for arbitrary text columns
- fail on anything else
## Write-side SQL object expansion
The symmetric write-side convenience should be explicit and declaration-driven, but it should not attempt semantic SQL analysis.
Agreed v1 surface:
- `@cols(?1)` expands one object argument to projected column names
- `@vals(?1)` expands the same object argument to matching placeholders and encoded bind values
- `@set(?1)` expands the same object argument to `column = ?` pairs and encoded bind values
- each macro accepts an optional `except:` filter, for example `@set(?1 except: "id", "updatedAt")`
Examples:
```lyng
tx.execute(
"insert into item(@cols(?1)) values(@vals(?1))",
item
)
tx.execute(
"update item set @set(?1) where id = ?2",
item,
item.id
)
```
Rules:
- once a clause uses `@cols`, `@vals`, or `@set`, plain sequential `?` placeholders are not allowed in the same clause
- non-expanded parameters in macro clauses must use explicit indexed placeholders such as `?2`
- the same object argument may be referenced multiple times
- object expansion is based on declaration metadata, not SQL metadata
- v1 excludes `@Transient` and `@DbExcept` fields automatically
- `except:` excludes additional fields for one specific macro use
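The placeholder rules above can be illustrated with a short sketch (table and field names are illustrative):

```lyng
// Mixing a macro with a plain sequential `?` in the same clause is rejected:
// tx.execute("update item set @set(?1) where id = ?", item, 1)   // not allowed

// Non-expanded scalars must use explicit indexed placeholders instead:
tx.execute("update item set @set(?1) where id = ?2", item, 1)

// The same object argument may be referenced more than once:
tx.execute("insert into item(@cols(?1)) values(@vals(?1))", item)
```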
### Write-side field encoding policy
Write-side encoding cannot rely on DB column type inference, so non-trivial field serialization must be explicit.
For each projected field:
1. if the value is already directly DB-bindable, bind it as-is
2. else if `@DbJson` is present, encode to canonical JSON text
3. else if `@DbLynon` is present, encode to Lynon binary
4. else if `@DbSerializeWith(adapter)` is present, call `adapter.encode(value, targetType)`
5. else fail with `SqlUsageException`
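As a sketch, a declaration exercising each branch of this policy might look like the following (class and field names are illustrative; `TrimAdapter` is assumed to implement `DbFieldAdapter.encode`):

```lyng
class Event(
    id: Int,                                   // step 1: directly DB-bindable
    @DbJson payload: Payload,                  // step 2: canonical JSON text
    @DbLynon snapshot: Payload,                // step 3: Lynon binary
    @DbSerializeWith(TrimAdapter) tag: String  // step 4: adapter-encoded
)
// An unannotated field of a complex type, e.g. `details: Payload`,
// would reach step 5 and fail with SqlUsageException.
```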
Direct DB-bindable values in v1:
- `null`
- `Bool`
- `Int`, `Real`, `Decimal`
- `String`
- `Buffer`
- `Date`, `DateTime`, `Instant`
This is intentionally stricter than decode-side behavior. On writes, there is no portable, reliable way to infer the intended target DB representation from SQL text alone.
### Adapter role
`DbFieldAdapter` is now symmetric by design:
- `decode(rawValue, column, row, targetType)` is used by `decodeAs<T>()`
- `encode(value, targetType)` is used by SQL object expansion
The adapter instance is captured in preserved declaration annotation metadata, not passed ad hoc at the call site.
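A minimal symmetric adapter sketch, using only the two hooks above (the trimming logic is illustrative):

```lyng
object TrimAdapter: DbFieldAdapter {
    // Write side: used by @vals/@set expansion via @DbSerializeWith(TrimAdapter)
    override fun encode(value, targetType) =
        when(value) {
            null -> null
            else -> value.toString().trim()
        }
    // Read side: used by decodeAs<T>() via @DbDecodeWith(TrimAdapter)
    override fun decode(rawValue, column, row, targetType) =
        when(rawValue) {
            null -> null
            else -> rawValue.toString().trim()
        }
}
```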
Future task:
- consider warnings or lints for risky annotation captures such as stateful adapters or closure-capturing instances