Error: truncated buffer / TypeError: this.buf.utf8Slice is not a function
When I try to read an event received from Kafka with the following schema, I first get an Error: truncated buffer exception, and when I customize the JSON schema I instead get a TypeError: this.buf.utf8Slice is not a function exception! I think something is wrong with the JSON schema. I've provided both the Avro and the JSON schema for easier debugging.
My Avro schema when producing events to Kafka:
{
  "name": "ComposedEvent",
  "type": "record",
  "fields": [
    {
      "name": "SearchResult",
      "type": [
        "null",
        {
          "name": "SearchResultRec",
          "type": "record",
          "fields": [
            { "name": "query", "type": "string" },
            { "name": "took", "type": "int" },
            { "name": "timed_out", "type": "boolean" },
            { "name": "hits", "type": "int" }
          ]
        }
      ],
      "default": null
    },
    { "name": "detectedDuplicate", "type": "boolean" },
    { "name": "detectedCorruption", "type": "boolean" },
    { "name": "firstInSession", "type": "boolean" },
    { "name": "timestamp", "type": "long" },
    { "name": "clientTimestamp", "type": "long" },
    { "name": "remoteHost", "type": "string" },
    { "name": "referer", "type": ["null", "string"], "default": null },
    { "name": "location", "type": ["null", "string"], "default": null },
    { "name": "viewportPixelWidth", "type": ["null", "int"], "default": null },
    { "name": "viewportPixelHeight", "type": ["null", "int"], "default": null },
    { "name": "screenPixelWidth", "type": ["null", "int"], "default": null },
    { "name": "screenPixelHeight", "type": ["null", "int"], "default": null },
    { "name": "partyId", "type": ["null", "string"], "default": null },
    { "name": "sessionId", "type": ["null", "string"], "default": null },
    { "name": "pageViewId", "type": ["null", "string"], "default": null },
    { "name": "eventType", "type": "string", "default": "unknown" },
    { "name": "userAgentString", "type": ["null", "string"], "default": null },
    { "name": "userAgentName", "type": ["null", "string"], "default": null },
    { "name": "userAgentFamily", "type": ["null", "string"], "default": null },
    { "name": "userAgentVendor", "type": ["null", "string"], "default": null },
    { "name": "userAgentType", "type": ["null", "string"], "default": null },
    { "name": "userAgentVersion", "type": ["null", "string"], "default": null },
    { "name": "userAgentDeviceCategory", "type": ["null", "string"], "default": null },
    { "name": "userAgentOsFamily", "type": ["null", "string"], "default": null },
    { "name": "userAgentOsVersion", "type": ["null", "string"], "default": null },
    { "name": "userAgentOsVendor", "type": ["null", "string"], "default": null }
  ]
}
My JSON schema when trying to read produced events:
{
  name: 'ComposedEvent',
  type: 'record',
  fields: [
    {
      name: 'SearchResult',
      type: [
        'null',
        {
          name: 'SearchResultRec',
          type: 'record',
          fields: [
            { name: 'query', type: 'string' },
            { name: 'took', type: 'int' },
            { name: 'timed_out', type: 'boolean' },
            { name: 'hits', type: 'int' }
          ]
        }
      ],
      'default': null
    },
    { name: 'detectedDuplicate', type: 'boolean' },
    { name: 'detectedCorruption', type: 'boolean' },
    { name: 'firstInSession', type: 'boolean' },
    { name: 'timestamp', type: 'int' },
    { name: 'clientTimestamp', type: 'int' },
    { name: 'remoteHost', type: 'string' },
    { name: 'referer', type: ['null', 'string'], 'default': null },
    { name: 'location', type: ['null', 'string'], 'default': null },
    { name: 'viewportPixelWidth', type: ['null', 'int'], 'default': null },
    { name: 'viewportPixelHeight', type: ['null', 'int'], 'default': null },
    { name: 'screenPixelWidth', type: ['null', 'int'], 'default': null },
    { name: 'screenPixelHeight', type: ['null', 'int'], 'default': null },
    { name: 'partyId', type: ['null', 'string'], 'default': null },
    { name: 'sessionId', type: ['null', 'string'], 'default': null },
    { name: 'pageViewId', type: ['null', 'string'], 'default': null },
    { name: 'eventType', type: 'string', 'default': 'unknown' },
    { name: 'userAgentString', type: ['null', 'string'], 'default': null },
    { name: 'userAgentName', type: ['null', 'string'], 'default': null },
    { name: 'userAgentFamily', type: ['null', 'string'], 'default': null },
    { name: 'userAgentVendor', type: ['null', 'string'], 'default': null },
    { name: 'userAgentType', type: ['null', 'string'], 'default': null },
    { name: 'userAgentVersion', type: ['null', 'string'], 'default': null },
    { name: 'userAgentDeviceCategory', type: ['null', 'string'], 'default': null },
    { name: 'userAgentOsFamily', type: ['null', 'string'], 'default': null },
    { name: 'userAgentOsVersion', type: ['null', 'string'], 'default': null },
    { name: 'userAgentOsVendor', type: ['null', 'string'], 'default': null }
  ]
}
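The reading code itself isn't shown in the issue; presumably it was something along these lines, where schema is the object above and message comes from the Kafka consumer (both names are placeholders):

const avro = require('avsc');

// Build a reader type from the JSON schema above.
const type = avro.Type.forSchema(schema);

// This decode call is where both errors surface: handing it a string
// (kafka-node's default 'utf8' encoding) trips Buffer-only methods such
// as utf8Slice, while a Buffer whose bytes don't line up with the schema
// fails with "truncated buffer".
const event = type.fromBuffer(message.value);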
Issue Analytics
- Created: 7 years ago
- Comments: 5 (2 by maintainers)
Top Results From Across the Web

TypeError: this.buf.utf8Write is not a function - Stack Overflow
Solution: when avsc is used on the client side with webpack or browserify, one has to use require('avsc/etc/browser/avsc').

Error while decoding Avro data with NodeJS - MSDN - Microsoft
TypeError: this.buf.utf8Slice is not a function. Here's my code: const avro = require('avsc'); module.exports = function (context, ...
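Following that Stack Overflow answer, the client-side fix is just a different require path:

// In browser bundles (webpack or browserify), use avsc's browser entry
// point instead of the default Node one:
const avro = require('avsc/etc/browser/avsc');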
I also had this problem with kafka-node. I solved it by setting the encoding on the consumer to ‘buffer’. For example:
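A minimal sketch of that consumer setup, assuming kafka-node's KafkaClient and Consumer APIs (the host and topic names are placeholders):

const kafka = require('kafka-node');
const avro = require('avsc');

// Reader type built from the JSON schema above.
const type = avro.Type.forSchema(schema);

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const consumer = new kafka.Consumer(
  client,
  [{ topic: 'composed-events' }], // placeholder topic name
  { encoding: 'buffer' }          // deliver message.value as a Buffer, not a string
);

consumer.on('message', (message) => {
  // message.value is now a Buffer, which is what type.fromBuffer expects.
  console.log(type.fromBuffer(message.value));
});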
See https://www.npmjs.com/package/kafka-node#consumer
Thanks, I see. The problem is that kafka-node's message.value isn't the right argument for type.fromBuffer: it expects a Buffer containing exactly a single encoded value, but AFAIK message.value is a string prefixed by a header (see https://github.com/mtth/avsc/issues/13). The best way to handle this would probably be to implement some kind of schema resolution logic (retrieving the type from the header), but in your case I believe you should be able to get a quick solution working as follows. You might also want to take a look at https://github.com/mtth/avsc/issues/22 for more information on Kafka messages.
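A rough sketch of such a quick fix, assuming the header is the Confluent wire format's five bytes (one magic byte plus a four-byte schema ID; the exact offset is an assumption, since the thread only says the value is prefixed by a header):

const avro = require('avsc');

// Reader type built from the JSON schema above.
const type = avro.Type.forSchema(schema);

function decodeMessage(value) {
  // Coerce a string value back into raw bytes; this step is unnecessary
  // if the consumer is already configured with encoding: 'buffer'.
  const buf = Buffer.isBuffer(value) ? value : Buffer.from(value, 'binary');
  // Skip the assumed 5-byte header, then decode a single Avro record.
  return type.fromBuffer(buf.slice(5));
}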