Last week the Buffer team descended on Singapore for our 9th Buffer Retreat. Ever since the first retreat back in San Francisco, I’ve tried to figure out ways to capture the special moments we share together as a team. Since we’re fully remote, these yearly get-togethers can be the only time we’re all together in person until the next one.
At our previous retreat in Madrid last year, I added a GIF Booth to the ever-growing arsenal of ways to capture the team together. Using a Raspberry Pi, a touchscreen, and a Raspberry Pi Camera, we captured some GIFs at the welcome drinks and the retreat dinner. They still get shared in Slack to this day.
For Singapore I wanted to build out something new while expanding on the idea, mostly making use of existing hardware that I’d have with me. This usually consists of my Sony A7rii, various lenses, an iPhone or two and an iPad. As an iOS developer, it made sense to do something using iOS and the iPad. I started looking at existing options on the App Store, before deciding to build something myself.
Hacking Together a GIF Booth
I could’ve easily used the iPad camera to record the GIFs but decided it’d be fun to make use of my Sony A7rii. Besides, who can say no to 4K GIFs and a bigger challenge? Fortunately, Sony has a Camera Remote SDK available for iOS, which allows you to control various features of your camera and provides a streaming protocol for the camera’s live view.
Using the SDK example as a guide, it was fairly easy to get the iPad connecting to the camera’s live view. Once connected, the simple GIF app I threw together started streaming the Liveview and displayed the frames from the stream full screen on the iPad. I also added some UI to allow people to tap on the screen to initiate a countdown for the GIF recording to start.
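Sony’s liveview protocol streams individual JPEG frames, so the display side can be fairly simple. Here’s a rough sketch of what that looks like, assuming each frame arrives as JPEG data; `liveviewFrameReceived` is an illustrative callback name, not the SDK’s actual delegate method:

```swift
import UIKit

final class LiveviewViewController: UIViewController {
    private let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Fill the whole iPad screen with the camera feed.
        imageView.frame = view.bounds
        imageView.contentMode = .scaleAspectFill
        imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(imageView)
    }

    // Called for every JPEG frame pulled off the camera's stream.
    func liveviewFrameReceived(_ jpegData: Data) {
        guard let frame = UIImage(data: jpegData) else { return }
        // Frames may arrive on a background queue; UIKit needs main.
        DispatchQueue.main.async {
            self.imageView.image = frame
        }
    }
}
```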
After tapping the big GIF button, the app started a countdown from 3 with some excessively big numbers, then displayed a “be silly”, “dance”, or some other random prompt to inspire a GIF pose. When the countdown hit 1, the app would send a request to the camera to start recording video; 5 seconds later, it would send another request to stop the recording. The countdown button would then reappear, letting people start the process over again.
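The flow above can be sketched with a couple of timers. `CameraControl` here stands in for the Sony SDK’s start/stop movie requests; the protocol and its method names are illustrative, not the SDK’s real API:

```swift
import Foundation

// Stand-in for the Sony Camera Remote SDK's recording requests.
protocol CameraControl {
    func startMovieRecording()
    func stopMovieRecording()
}

final class GIFCountdown {
    private let prompts = ["Be silly!", "Dance!", "Strike a pose!"]
    private let camera: CameraControl

    init(camera: CameraControl) { self.camera = camera }

    /// Called when the big GIF button is tapped.
    func begin(showNumber: @escaping (Int) -> Void,
               showPrompt: @escaping (String) -> Void) {
        var secondsRemaining = 3
        Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { timer in
            showNumber(secondsRemaining)
            if secondsRemaining == 1 {
                timer.invalidate()
                // Countdown done: show a random prompt and start recording.
                showPrompt(self.prompts.randomElement()!)
                self.camera.startMovieRecording()
                // Stop after the 5-second GIF window.
                Timer.scheduledTimer(withTimeInterval: 5, repeats: false) { _ in
                    self.camera.stopMovieRecording()
                }
            }
            secondsRemaining -= 1
        }
    }
}
```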
Live Feed of the GIFs
That could’ve been it. I could simply take the recordings and generate the GIFs later to share with the team. But I also wanted to hook it into our brand-new Retreat app, which I’m sure we’ll blog about soon. I thought it might be neat for people to see the GIFs in the app shortly after making them. My thinking was that the quick feedback loop might encourage people to create more.
Instead of trying to transfer the 4K video from the camera to the iPad, I decided to record the Liveview being shown on the iPad screen. Once finished, the app would save the recording, generate a low-quality preview GIF, and upload it to Firebase to share with the team. I had the screen recording all set up, starting and stopping at the right times, and made use of NSGIF for generating the preview GIF. Everything was ready to upload to the server.
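The post doesn’t name the API behind the screen recording, but ReplayKit is one plausible way to capture the Liveview as it’s drawn: `startCapture` hands you video sample buffers, which an `AVAssetWriter` can write out to an MP4 that’s ready for GIF generation. A rough sketch under those assumptions:

```swift
import ReplayKit
import AVFoundation

final class LiveviewRecorder {
    private var writer: AVAssetWriter?
    private var input: AVAssetWriterInput?

    func start(outputURL: URL) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1920,
            AVVideoHeightKey: 1080,
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        self.writer = writer
        self.input = input

        RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, type, error in
            guard error == nil, type == .video else { return }
            // Start the writing session at the first frame's timestamp.
            if writer.status == .unknown {
                writer.startWriting()
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            }
            if writer.status == .writing, input.isReadyForMoreMediaData {
                input.append(sampleBuffer)
            }
        }, completionHandler: nil)
    }

    func stop(completion: @escaping (URL?) -> Void) {
        RPScreenRecorder.shared().stopCapture { [weak self] _ in
            self?.input?.markAsFinished()
            self?.writer?.finishWriting {
                completion(self?.writer?.outputURL)
            }
        }
    }
}
```

From there, NSGIF can take the saved video’s URL and hand back a GIF URL in a completion block, which is what gets uploaded.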
All good, except then I realized the iPad wouldn’t have any internet connectivity, as it was connected via Wi-Fi to the camera.
Working around Connectivity
Stumped, I started looking around for possible solutions. One was setting up a smart router to direct traffic to different places, but that would require some tinkering at the venues. Instead, I opted to make use of another piece of hardware: the spare iPhone. I created a second app, an uploader, that used Multipeer Connectivity to link up with the GIF Booth app. On launch, the GIF Booth would look for the uploader app and show it as an option to connect to. Once connected, it would navigate to the Liveview and show the GIF Booth UI. After a GIF was generated, the GIF Booth would send it to the iPhone, which in turn would upload it to Firebase.
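A minimal sketch of how the two apps could find each other over Multipeer Connectivity; the `gif-booth` service type, peer names, and wiring here are illustrative, not the actual project’s:

```swift
import MultipeerConnectivity

// Both apps agree on a service type (1-15 lowercase letters, numbers, hyphens).
let serviceType = "gif-booth"
let peerID = MCPeerID(displayName: "gif-booth-peer")
let session = MCSession(peer: peerID,
                        securityIdentity: nil,
                        encryptionPreference: .required)

// On the iPhone uploader: make this device discoverable.
let advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                           discoveryInfo: nil,
                                           serviceType: serviceType)
advertiser.startAdvertisingPeer()

// On the iPad booth: look for the uploader and invite it into the session.
let browser = MCNearbyServiceBrowser(peer: peerID, serviceType: serviceType)
browser.startBrowsingForPeers()
// In the browser delegate's foundPeer callback:
// browser.invitePeer(foundPeer, to: session, withContext: nil, timeout: 30)

// Once connected, ship each finished GIF file over to the uploader,
// which then pushes it to Firebase.
func send(gifURL: URL, to uploader: MCPeerID) {
    session.sendResource(at: gifURL,
                         withName: gifURL.lastPathComponent,
                         toPeer: uploader) { error in
        if let error = error { print("Transfer failed: \(error)") }
    }
}
```

`sendResource` is a good fit here because it transfers a whole file with progress and completion callbacks, rather than requiring the GIF to be chunked manually over a stream.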
While somewhat over-engineered, the project was working smoothly and had let me dig into parts of iOS, like Multipeer Connectivity, that I hadn’t used much before. Digging into new parts of iOS is something I try to do with most side projects; we’re unlikely to use Multipeer within the Buffer iOS app, but side projects like this one allow me to explore new APIs.
The only thing left was to display the GIFs in the retreat app. We’ve had a busy few months working on Buffer, as well as other apps like the upcoming Reply app, so the Retreat app has been on the back burner a little and I wanted displaying the GIFs to take as little work as possible. We’d made use of a chat library for internal chat while on the retreat, mainly for organizing dinners and ventures out into the city. So I decided it’d be easiest to add a new chat room that the uploader app would simply post the GIFs to. This meant all of the work would live in the uploader app, avoiding additional work in both the iOS and Android projects.
In this behind the scenes GIF you can see the iPad, Camera and the iPhone on the table as well as my laptop which was used to tinker with a few settings and also provided power to a few devices during the evening.
Wrapping Up
With a few minor tweaks at the retreat dinner to adjust the GIF length and the number of frames in the preview GIF, it all ended up working seamlessly. When all was said and done, we had over 120 GIFs recorded. I’m still generating a few of them from the higher-quality 4K footage, but here’s a preview…
I’m hoping to tidy up the project and possibly release it on the App Store, using the iPad camera as the source with the option to connect other cameras for more control. I’d love to hear from you if you might find it useful.