Farcaster Frames was released to much fanfare yesterday, and on the surface, the deceptively simple idea has already yielded incredible creativity: everything from MUDs, to polls, to NFT mints. The number of folks ideating on the potential was even larger. Then along came this post:
Challenge accepted.
So naturally, people learned two things very quickly:
Frames can be animated
Q has a metavm project
And boy, did that generate some questions. So what I'm setting out to do in this article is circle back on my post from three days ago, where I said I'd show what the future of computing and crypto will look like very soon. For those who got to try the demo before the VPS host piping the command data over decided to nerf the vCPU and stopped responding to tickets (big shout-out to Vultr, you're a garbage service): congrats – you just got your first taste of the future I was describing.
So in this article, I'm going to describe how it works, and how I was able to build it in only two hours.
Farcaster Frames use meta tags in a style very similar to OpenGraph, enhancing it with custom button texts and submission URLs that can be signed by users on Farcaster, presenting a basic loop:
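To make that loop concrete, here's a rough sketch of the meta tags a frame page serves. The tag names follow the published Frames spec; the URLs and button labels here are placeholders of my own:

```typescript
// Build the Frame meta tags for a page. Tag names follow the Frames spec
// (fc:frame, fc:frame:image, fc:frame:button:N, fc:frame:post_url); the
// URLs and labels passed in are placeholders.
export function frameMetaTags(
  imageUrl: string,
  postUrl: string,
  buttons: string[],
): string {
  const tags = [
    `<meta property="fc:frame" content="vNext" />`,
    `<meta property="fc:frame:image" content="${imageUrl}" />`,
    `<meta property="fc:frame:post_url" content="${postUrl}" />`,
    // Buttons are 1-indexed in the spec.
    ...buttons.map(
      (label, i) =>
        `<meta property="fc:frame:button:${i + 1}" content="${label}" />`,
    ),
  ];
  return tags.join('\n');
}
```

A client like Warpcast reads these tags, renders the image and buttons, and POSTs a signed payload to the post_url when a button is pressed.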
That frame data includes an image URL to display, which is expectedly rendered in an <img> tag. But browsers support a lot of image formats, and so I remembered an old trick used by webcams back in the day: MJPEG. With MJPEG, you can hang onto a request indefinitely until the user closes it, and send back raw JPEG images one by one. This meant I didn't need special video support; I could just point the frame image URL at an endpoint that returned MJPEG as its response.
Now I just needed to test it, so I built a simple Next.js API handler that kept connections open and streamed the same frame to every active connection as it was rendered. I started simple: a setInterval loop that loaded six images from the local fs and sent them every 500ms:
import fs from 'fs';
import { join } from 'path';
import type { NextApiRequest, NextApiResponse } from 'next';

let frame = 0;
const clients: any[] = [];

// Preload the six test images from the local filesystem.
export const images = [
  fs.readFileSync(join(process.cwd(), 'images/image1.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image2.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image3.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image4.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image5.jpg')),
  fs.readFileSync(join(process.cwd(), 'images/image6.jpg')),
];

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const boundary = 'mjpeg';
  const headers: Record<string, string> = {
    'Cache-Control': 'private, no-cache, no-store, max-age=0',
    'Content-Type': `multipart/x-mixed-replace; boundary="${boundary}"`,
    Connection: 'close',
    Pragma: 'no-cache',
  };
  res.writeHead(200, headers);

  const ref = {
    // Write one JPEG as a multipart chunk on the long-lived response.
    mjpegwrite: (buffer: Buffer) => {
      res.write('--' + boundary + '\r\n', 'ascii');
      res.write('Content-Type: image/jpeg\r\n');
      res.write('Content-Length: ' + buffer.length + '\r\n');
      res.write('\r\n', 'ascii');
      res.write(buffer);
      res.write('\r\n', 'ascii');
    },
    mjpegend: () => {
      res.end();
    },
  };

  // Drop the client from the broadcast list when its connection goes away.
  const close = () => {
    const index = clients.indexOf(ref);
    if (index !== -1) {
      clients.splice(index, 1);
    }
  };
  res.on('finish', close);
  res.on('close', close);
  res.on('error', close);

  clients.push(ref);
}

// Broadcast a JPEG frame to every connected client.
export const mjpegsend = (buffer: Buffer) => {
  for (const client of clients) {
    client.mjpegwrite(buffer);
  }
};

setInterval(() => {
  mjpegsend(images[frame]);
  frame = (frame + 1) % images.length;
}, 500);
Loading the API handler url in the browser worked swimmingly. Ok, great, so then the next challenge: make it run Doom.
Luckily for me, I had already solved this problem in the research around Quilibrium. One of the advanced features launching later this year is the metaVM, which translates instruction set architectures into an executable format usable by the network, along with many other important components needed to support a fully functioning VM. The metaVM supports a basic framebuffer device that is IO-mapped to RAM at a specific location. The VM translates to a choice of execution calls: durable – on the hypergraph, and therefore somewhat slower – or ephemeral – execution state is not stored, merely piped over. Keyboard and mouse inputs work similarly, via hardware interrupts. Finally, the file system itself is fulfilled by a virtio-9p-compatible application, which translates read/write requests for inodes into hypergraph calls. Together, you get a fully distributed virtual machine with optional durability at multiple levels. Despite sounding rather complicated, this is quite simple to implement on Q, and looks a lot like a traditional emulator when you dive into the code.
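Since the metaVM itself isn't public yet, here's only a loose sketch of the framebuffer idea – every name and constant below is a hypothetical stand-in. The point is that the display is just a fixed window of guest RAM, so reading a frame is a bounded copy:

```typescript
// Hypothetical sketch: the framebuffer device is IO-mapped at a fixed
// offset in guest RAM, so grabbing the current display is one subarray
// copy. All names and constants are stand-ins, not the real metaVM API.
const FB_BASE = 0xa0000;                   // assumed framebuffer offset
const FB_WIDTH = 320;
const FB_HEIGHT = 200;
const FB_BYTES = FB_WIDTH * FB_HEIGHT * 4; // assuming 32-bit pixels

export function readFramebuffer(guestRam: Uint8Array): Uint8Array {
  // Copy the window so later guest writes don't race the JPEG encoder.
  return guestRam.slice(FB_BASE, FB_BASE + FB_BYTES);
}
```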
So then the remaining work boiled down to the following items:
Handle inputs from the buttons and send them back as key down/key up events
Build the framebuffer worker and kick it off on start of the Frame server
Convert the framebuffer data into JPEGs
Deploy a filesystem map containing Linux and Doom, compatible with the metaVM virtio-9p driver, to the hypergraph
Execution state updates can be streamed directly from metaVM with the RPC client, so we scoped the stream to the section of RAM containing the framebuffer, and directly invoked the interrupts for keyboard inputs. That's the first two down.
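The button-to-input half is simple enough to sketch. A signed Frame POST carries a buttonIndex in its payload, so translating it into a key down/key up pair is one lookup. The specific keys chosen here, and the shape of the event objects, are my own illustrative assumptions:

```typescript
// Sketch: translate a Frame button press into a key event pair for the
// emulated machine. The buttonIndex comes from the Frame POST body; the
// key assignments below are illustrative, not the production mapping.
const BUTTON_KEYS: Record<number, string> = {
  1: 'ArrowLeft',
  2: 'ArrowUp',
  3: 'ArrowRight',
  4: 'Enter', // use/fire
};

export function buttonToKeyEvents(
  buttonIndex: number,
): { type: 'keydown' | 'keyup'; key: string }[] {
  const key = BUTTON_KEYS[buttonIndex];
  if (!key) return [];
  // Emit a down immediately followed by an up, since a Frame button
  // press can't be held.
  return [
    { type: 'keydown', key },
    { type: 'keyup', key },
  ];
}
```

Each event pair would then be fed to the VM as keyboard interrupts.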
Node doesn't have a clear-cut way to quickly convert raw buffers to JPEGs, and I wanted to hack this together quickly, so I used node-canvas as the render target for the raw image data, then used canvas.toBuffer('image/jpeg') to create the image. Once the worker publishes the buffer data, the message handler on the API side only needs to call the mjpegsend(buffer) method defined above. Next one down.
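For illustration, here's what the pixel prep might look like, under my own assumption that the framebuffer arrives as 16-bit RGB565 pixels (a common framebuffer format; adjust for the real one): expand each pixel into the RGBA layout that putImageData expects, then hand the canvas to toBuffer:

```typescript
// Expand a raw RGB565 framebuffer into the RGBA bytes node-canvas's
// putImageData() expects. RGB565 is an assumed pixel format here —
// adjust the bit twiddling for whatever the real framebuffer uses.
export function rgb565ToRgba(src: Uint16Array): Uint8ClampedArray {
  const out = new Uint8ClampedArray(src.length * 4);
  for (let i = 0; i < src.length; i++) {
    const px = src[i];
    out[i * 4 + 0] = ((px >> 11) & 0x1f) << 3; // 5-bit red   → 8 bits
    out[i * 4 + 1] = ((px >> 5) & 0x3f) << 2;  // 6-bit green → 8 bits
    out[i * 4 + 2] = (px & 0x1f) << 3;         // 5-bit blue  → 8 bits
    out[i * 4 + 3] = 255;                      // opaque
  }
  return out;
}
```

From there it's ctx.putImageData(new ImageData(rgba, width, height), 0, 0) on a canvas of matching size, followed by canvas.toBuffer('image/jpeg'), and the result goes straight to mjpegsend.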
For the last one, I had a bit of a cheat: I'd already built this filesystem map a while ago to demo metaVM (hi friends on Unlonely!) and the QConsole. That's all the work that was needed.
Architecturally, the frame integration then looks like this:
And there you have it: Doom on Frames.
obligatory: https://paragraph.com/@quilibrium.com/doom-on-frames
IMPORTANT ANNOUNCEMENT: We can play @cassie's DOOM again! It seems MJPEG streams started working again - maybe when WC switched to CF for image proxying? What's next? ScummVM port? Remote controlled robots/webcam streams? Tx @samuellhuber.eth for making me doublecheck this
How it works: https://paragraph.xyz/@quilibrium.com/doom-on-frames
Turns out it only works on web, not on mobile 😭
ScummVM 👌
Is it possible to create a frame where we see real-time sports score data?
Yea but only on web it seems 😮💨
ScummVM already works, it's just really limited because you only get four buttons
Yeah I figure we can only do the early games, right - 4x cursor keys and text entry (any text input + button == enter). I think that was the "AGI" engine.
got a game in mind?
hell yeah
@remindbot 3 days
@alecpap reminder set for July 4, 2024 at 4:37 PM UTC. view it here: https://degenapi.com/remind-bot/id/3
@alecpap this is your reminder. view it here: https://degenapi.com/remind-bot/id/3
For quite a while, we were able to upload gifs up to 15mb. It seems within the last week this has changed back to 10mb. I've tested this with multiple gifs that previously worked.
👀
new with frames, what are some of the best ones I should try out?
If in the mood for degeneracy https://warpcast.com/tybb/0x9ee1b16a
Create an e2e verifiable Frame poll with Farcaster.vote!
We categorize more all popular Frames here: https://www.degen.game/frames/featured
oh wow, this is a great starting point. Thank you!
Love this 1 $degen
/card is a really good demonstration of the tech (invite only, but if you can get one definitely check it out).
Find.farcaster.info
If you're on web (this one doesn't work on mobile), how about Doom? https://frame.quilibrium.com/polls/1
This is really cool, first time interacting with a frame like this
this is actually the first time i've been able to play your Doom frame. THANK YOU!
amazing! curious what are the limitations around mobile that are diff than desktop for frames?
this frame relies on a really specific trick with browser support for MJPEG in image tags, as frames do not support video. More details here: https://paragraph.xyz/@quilibrium.com/doom-on-frames
Make a meme n publish as frame via app.poster.fun in one click
Yo @blockheim, try out Base Name Service Frame App. https://frame.basename.app/api
we just launched one! https://caddi-hunt-frames.vercel.app
building this onboarding one plz 88 $degen https://warpcast.com/jpfraneto/0x60320267
Doom with live video streaming inside a frame. Wow. Amazing read - https://paragraph.xyz/@quilibrium.com/doom-on-frames
Who is doing the most innovative things on farcaster? Tag 👇
I am using Farcaster Frames for generative art mints! https://www.freeformselect.com/
Brett Scrimblebottom
i thought "frameceptions" are reaalllly fkn cool when i stumbled on it the other day: https://warpcast.com/sartocrates.eth/0x76ea2a2d
https://warpcast.com/0xsmallbrain/0xb597a44a
this blew my mind when i stumbled on it the other night
Uhm @nounishprof and @adrienne with their @gmfarcaster show 💯
Come check us out over in /gmfarcaster channel - latest episode from yesterday pinned to the top! And thanks @rachelw 💜
Frames is the thing that made Farcaster pop last week. TLDR: cast can now contain mini-apps and people are building insane stuff on it (chess, shopping, DOOM) https://docs.farcaster.xyz/reference/frames/spec
Wait who made DOOM?
@cassie put DOOM in frame
Oh, wait I just realized you asked "who" not "what", nevermind
Not necessarily the most innovative thing, Casters have a lot of talent! but for our community, it's a real symbol. The Fraternal Charter of our project /thecryptomasks is immortalized and available in limited edition only on /frames https://warpcast.com/thecryptomath/0xee86ea0f
/greecaster is the thing 🔥🔥
Buongiorno dear Cozomo 🤟🏻 Just subscribed, let's see how this works out 👀
Direct gen-art Base mints in Frames are super fun and such a different experience than what we're used to. Manifold is supporting a lot of this tech on their end as well.
This is not a frame direct mint, but @nicole is doing some beautiful work. https://app.manifold.xyz/c/the-moon
tysm @smol !
We're back! https://warpcast.com/cassie/0xc329ea28 https://frame.quilibrium.com/polls/1
whoops, forgot to update the endpoint update matching, one sec
the improved framerate is revealing that we need an action button, @v pls let us have more buttons 🥺
Reminder for folks who missed it the first time around: this does not work on mobile, visit it on web
💪⚡️
yes! i finally got to see it in action (o.o) amazing!
very nice
my favorite frame so far tbh
Of all the frame experiments that I've seen, this is the one I'm still thinking about days later. Would love to hear/read a full run down of how you felt this experience went, as well as the edge cases you came across in the process. Its a huge learning experience for many & your insight has a lot of value Cassie
Already wrote about it 🫡 https://paragraph.xyz/@quilibrium.com/doom-on-frames
insightful as always🙏
Hey there! wanted to quickly share some info about Open Frames - it’s an interoperable standard that expands on the Frames spec. It would be awesome to add you as supporters and list you in our awesome repo! https://github.com/open-frames/awesome-open-frames/
@launch Gaming onframe
You scouted @cassie’s launch! https://www.launchcaster.xyz/p/65b74e5d2cccdd9d36ea826c
me: wow, I made the picture show a little flag! @cassie: https://paragraph.xyz/@quilibrium.com/doom-on-frames
me: guess the only way to solve this one is to pay $900 to Vercel @cassie:
333 $DEGEN
NGL this workflow chart is way underrated alpha. 69 $degen
Built different
How did I put Doom on Farcaster Frames with only two hours of work? I told you I'd show you what the future of crypto looks like very soon, this was just the teaser of what abandoning the blockchain looks like. https://paragraph.xyz/@quilibrium.com/doom-on-frames
This is incredible
This is amazing! I just discussed this with my teammate and I think we can implement something like this. Any help with getting into Quilibrium or source code would be amazing. I am going to take a look at the docs. 10 $degen!
Great read 500 $degen
amazing, what are the limitations here, though? how many image frames can I send? or is there a timeout?
tip 69 $degen to @cassie heck ya
Oh so you were the dev who did this. Awesome ty for this write up!
I was today years old when I learned about the MJPEG image format. Pretty cool! https://en.wikipedia.org/wiki/Motion_JPEG
Amazing.
hey loved what you did there, thanks for sharing! dumb question, was playing with your code sample, what does the client side look like within nextjs? using the api route as img source within nextjs makes it interpret it as html and 404
nevermind fixed it, had nothing to do with client side, had to update route code
So cool!!