The Payload Was There All Along: Cross-Runtime Protobuf Is a Lie

On paper, Protobuf is the ideal choice for cross-platform communication:
It’s fast, strongly typed, and designed to work the same across languages.

So when I wired up a Python producer with a Node.js consumer, using Protobuf and google.protobuf.Any to wrap payloads, I expected everything to Just Work™.

Instead, I got this:

Envelope has no payload

But it did have a payload. The bytes were there. They matched the schema. I packed the message correctly in Python. I decoded it correctly in Node.

And yet… nothing.

🛰️ What I Built

Python side:

  • Used rstream to send raw Protobuf bytes over RabbitMQ Streams

  • Wrapped payloads in google.protobuf.Any

  • Serialized with Envelope.SerializeToString()

Node.js side:

  • Used rabbitmq-stream-js-client to receive messages

  • Used protobufjs to decode the Envelope and unpack the payload (a trimmed sketch of that decode path follows)
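
For context, the Node-side decode path looked roughly like the sketch below. It's trimmed and hedged: the proto path, the events.Envelope type name, and the wiring into rabbitmq-stream-js-client's consumer callback are placeholders, not the real code.

const protobuf = require("protobufjs");

async function main() {
  // "envelope.proto" and "events.Envelope" are placeholder names for the schema.
  const root = await protobuf.load("envelope.proto");
  const Envelope = root.lookupType("events.Envelope");

  // Handler wired into the stream consumer; `content` is the raw Buffer of one message.
  const handleMessage = (content) => {
    const envelope = Envelope.decode(content);
    console.log(envelope); // this is where payload showed up as undefined
  };

  // ... declare the rabbitmq-stream-js-client consumer here and feed it handleMessage
}

main();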

🧪 The Symptom

Even after decoding the Envelope, this is what I got:

{
  eventId: [],
  routingDomain: [],
  payload: undefined
}

Not null. Not an empty object. Just... undefined.

🔬 The Root Cause: Different Any Implementations

Turns out, Protobuf's Any is only portable on the wire. The bytes travel across runtimes just fine, but how much the library does for you when unpacking them varies wildly, and nothing works end to end unless you take specific steps.

Here’s what Python does when you call Any.Pack(message):

  • It serializes the message into a byte array

  • It sets the .type_url to type.googleapis.com/Fully.Qualified.MessageName

But here’s what protobufjs in Node does not do by default:

  • It doesn’t register type_url resolvers

  • It doesn't automatically know how to unpack Any

  • If you decode an Envelope, payload is there, but its value is just a raw buffer that you have to decode yourself (see the sketch below)
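
Concretely, the Any comes out as nothing more than a type URL string plus raw bytes. A small sketch of what I mean, with a hypothetical type name (depending on how the Any descriptor gets loaded, protobufjs may expose the field as typeUrl or type_url):

const envelope = Envelope.decode(msg.content);
console.log(envelope.payload);
// Roughly: { typeUrl: 'type.googleapis.com/events.Peer', value: <Buffer ...> }
// value is still serialized bytes; nothing has decoded the inner message yet.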

✅ The Fix: Manual Unpacking

In Node.js, after decoding the Envelope, I had to do this:

const envelope = Envelope.decode(msg.content);
const peer = Peer.decode(envelope.payload.value); // payload is the Any; .value still holds the serialized Peer bytes

There's no .unpack() method.
No magic like Python gives you.

You have to know in advance what you're unpacking, and decode payload.value yourself.

Also, protobufjs doesn’t do anything with type_url unless you write the glue code yourself.
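
If you want type_url to actually drive anything, the glue is a small registry plus a lookup. Here's a sketch of what I mean, with hypothetical message names; it reads the field defensively since protobufjs may surface it as typeUrl or type_url depending on how the schema was loaded:

// Map fully qualified message names to the protobufjs types loaded from the schema.
const typeRegistry = {
  "events.Peer": root.lookupType("events.Peer"),
  "events.Heartbeat": root.lookupType("events.Heartbeat"),
};

function unpackAny(anyField) {
  // type_url looks like "type.googleapis.com/events.Peer"; keep the part after the slash.
  const url = anyField.typeUrl || anyField.type_url;
  const fullName = url.split("/").pop();
  const MessageType = typeRegistry[fullName];
  if (!MessageType) {
    throw new Error("No decoder registered for " + fullName);
  }
  return MessageType.decode(anyField.value);
}

const peer = unpackAny(envelope.payload);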

🧠 Lesson Learned

If you’re using google.protobuf.Any:

  • Python → Python? Everything just works.

  • Node → Node? As long as you decode manually, it's fine.

  • Python → Node? You'll decode empty fields unless you manually unpack.

  • ❌ And don’t rely on type_url unless you build a mapping and write a dispatcher.

🤯 Bonus Confusion: It Did Decode

Protobuf is happy to decode almost any bytes into a message, as long as they parse as valid wire format; fields it doesn't recognize are silently skipped rather than rejected. So when I accidentally sent a plain string (like "msg-123"), it decoded into a message with no fields set, and nothing complained.

If you decode garbage into Protobuf, it will cheerfully return a garbage object.
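
The cheap defense is a sanity check right after decoding, before trusting anything downstream. A minimal sketch, assuming the same Envelope as above:

const envelope = Envelope.decode(msg.content);
const typeUrl = envelope.payload && (envelope.payload.typeUrl || envelope.payload.type_url);

if (!typeUrl) {
  // decode() "succeeded", but these bytes were probably never a real Envelope.
  throw new Error("Envelope decoded without a payload/type_url; check what the producer actually sent");
}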

🧵 TL;DR

  • Cross-runtime Protobuf works, but only if you understand how each runtime's library actually behaves

  • google.protobuf.Any isn’t truly plug-and-play — it's a pattern, not a guarantee

  • Don't assume Protobuf interoperability works without glue logic

  • Always log your decoded structures and validate assumptions
