Scribble on iPadOS, a test

There is something incredibly magical about using the Pencil on iPadOS 14 with Scribble. This is a short post written entirely in Scribble using iA Writer. There is certainly a bit of a learning curve (how do I capitalize letters?), but even in just writing this post I’m learning quickly. What feels like magic is using an internet-connected notebook, a sensation that is hard to describe as I get used to it. I’ll keep using it and report back if it becomes a major input method for me as I continue my iPad journey!

Pro Tools

I spend money on software. I think developers deserve to get paid, the products I pay for are higher quality, and I would rather pay with dollars than with something else (attention, eyeballs, my data being shared). I find that a lot of people, even technologists, still complain about software prices. When Apple first announced the iOS App Store, it pushed prices to the bottom, and we’re finally seeing fairer prices return. Here is a list of my favorite pro tools on my iPad.

Anything else you recommend? Hit me up on Twitter.

Pencil Kit 2020

With a few days to digest WWDC, I’m excited about the upcoming iPad changes even if they didn’t completely blow me away. Apple announced a handful of quality-of-life changes (enhanced apps, sidebars, translation, Safari, search, Siri, Shortcuts) that won’t necessarily change the way I use my iPad but will make it more pleasant and intuitive to use. It’s clear to me the iPad’s big 2020 moment was the iPad Magic Keyboard (which I continue to love).

They did sneak some new things into PencilKit that excite me, both the new features themselves and the larger signal that they are still pushing it forward. The two big PencilKit changes are around handwriting and shapes.
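For the developers reading along, adopting these canvas features is mostly a matter of embedding Apple’s PencilKit views. Here is a minimal sketch of what that looks like, assuming the iPadOS 14 SDK; the view controller name and layout are my own illustration, while PKCanvasView and PKToolPicker are the framework’s actual types (Scribble itself works automatically in standard text fields, no code required):

```swift
import UIKit
import PencilKit

class SketchViewController: UIViewController {
    private let canvasView = PKCanvasView()
    // iOS 14 initializer; keep a strong reference, the system does not retain it.
    private let toolPicker = PKToolPicker()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        canvasView.drawingPolicy = .anyInput // allow finger or Pencil drawing
        view.addSubview(canvasView)

        // Show Apple's standard pen/marker/eraser picker for this canvas.
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        toolPicker.addObserver(canvasView)
        canvasView.becomeFirstResponder()
    }
}
```

With the canvas in place, apps get the system inking experience, and on iPadOS 14 the standard tool picker picks up the new capabilities for free.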

When I use my iPad in tablet mode, I could also call it creative mode. Even if I’m in a lean-back reading mode, I have a Pencil in hand and I’m taking screenshots, cropping, excerpting and taking notes. In that mode, if a message suddenly comes in that I need to respond to, I am presented with a huge iPad keyboard. I tried the floating keyboard but found it just as awkward to type with one hand. My solution was to install SwiftKey (remember that?) so I can at least drag my fingers over the huge keyboard to spell out words. If I have to write something long I attach the keyboard or grab my phone, but I can easily imagine using handwriting for the short one-to-three-word responses that are likely my primary use case. Having universal handwriting input just makes creative mode that much smoother. The other handwriting features, including cut, paste, move and changing colors, are the icing on the cake that makes my notes-plus-Pencil apps much more useful.

Finally, I’m excited to try the way shapes snap into place if you draw and hold on them. A number of other apps have embraced this already (I first saw it in Adobe Comp) and it can be very powerful when I remember it exists. Having it as a system default should help with my memory. I plan to use this to do more hand-sketched wireframing, which I do now for ideating but typically have to move over to another app for clarity when I share it with people.

On that note... why is there still not a Pro wireframing app for iPad?


I received an invite to Hey today from my friend Oz Lubling. He first shared the video with me yesterday and I was floored. What a thoughtfully designed piece of software. It’s maybe the best example I’ve seen of ethical software. It’s designed to make you feel more in control and to reduce the amount of stress software can impose on us. I also find it inspiring for someone to question the way a 30-plus-year-old technology works and re-imagine it. It’s not re-inventing email. It’s managing email in a far more modern, efficient and calming way. Oz told me over text that more software will start to work this way because once you see it, it can’t be unseen. I agree. I couldn’t sleep last night because I kept thinking about what software can be. I have a lot more to say on that subject. Until then, go subscribe to Oz’s Newsletter for his insights and links on product design, UX, apps and culture.

iPad WWDC Wishlist

With WWDC right around the corner, I thought I’d write up my iPad wishlist. It’s a short list, and not something I’ve been thinking about for months like my favorite podcasters have.

#1 - Better video-conferencing support. If this is the only thing that comes true I would be quite happy. This breaks down into three wishes. First, I want to background my video-conferencing app without my camera turning off. I appreciate the level of security this initially provided, but in a remote-first world it makes little sense. I want to be smiling at my co-workers and then pull up a doc in full view, Split View or Slide Over without my video turning off. It’s so basic, and it’s the reason I grab my MacBook Pro for one or two meetings a week (for most meetings I don’t care if I go dark, but sometimes it’s important not to). Second, I want to use video-conferencing apps (Zoom/Meet) in split-screen and/or Slide Over. I enjoy the iPad multi-tasking system and sometimes want access to notes, calendars, spreadsheets and other reference material while I am meeting with someone. Finally, and likely the most difficult change, is the awkward camera angle. The iPad has its camera on top in portrait orientation, which means that when you are working in landscape (always) and you jump on a video conference, you are never looking at the camera, and your gaze is distinctly different from everyone else’s, who by looking at the meeting appear centered and making eye contact. This feels like a difficult change without new hardware (which I would buy immediately), but there is likely a way to make the situation better with machine-vision algorithms. Another acceptable solution would be support for external webcams, especially the holy grail of using your iPhone as your iPad’s camera. I will sometimes call in from an iPhone so I can take notes on my iPad, which works fine until the moment someone shares their screen and we are all looking at a spreadsheet that I’m trying to decipher on my phone’s smaller screen (even though I have Apple’s biggest phone screen).

#2 - Pinned apps in Split View. My most common multi-tasking setup has a primary app in 3/4-screen view and another in 1/4 view. It’s actually my favorite multi-tasking view on any OS I’ve ever used. Apps look great in 3/4 view and like iPhone apps in 1/4 view, and I think it’s perfect. I use Slide Over for quick-reference apps, like calendars or to-do lists, that I want to pull up for 30 seconds and then dismiss. I wish I could mark either of my Split View screens as pinned, so that when I launch another app it automatically goes into the non-pinned view. As a use case, I might be writing a document in 3/4 view with Safari open in 1/4 for reference and then decide I quickly want to open Slack and ask someone a question. I hit Cmd-Space, search for Slack and hit Enter, and then Slack takes over the entire screen, when I would have preferred it just popped into the 1/4 view without taking my focus away from the main document I’m working on.

#3 - Keyboard shortcuts / better support for Split View. There is just so much that can be done here, but essentially, when I open an app I would love a power-user feature where I can assign it to Split View or Slide Over. I would also use a Shortcuts feature that allowed me to pair two apps to open together, like opening my calendar and to-do list side by side.

#4 - Chromium support. I know there are security reasons why Apple doesn’t allow third-party rendering engines, but the fact is some things on the Internet still only work in Chrome. In the big picture it’s uncommon, but in the world of Shopify you will find many Shopify apps that don’t behave properly unless accessed with a Chromium browser. This is one of the reasons I use Screens to access my MacBook Pro a few times a week.

#5 - Volume keyboard shortcuts for the Magic Keyboard. This is my least important wish and the one that will most likely come true. I listen to a lot of music, take a lot of calls, etc., and volume controls are important. At first I didn’t fuss too much when the iPad Magic Keyboard shipped without volume controls, but when I switch back to an external keyboard I find myself using them frequently.

Above all, my top wish is that iPadOS continues to receive love and care from Apple. They’ve done a fantastic job building out the OS over the last few years and they make great decisions. I’m just excited, anticipating whatever it is they will announce. When my friends use the adage “all operating systems suck,” I’m always quick to respond, “well, not iPadOS.”

Transient Notes on the Agenda

I’m trying to be more deliberate about my Transient notes. One step is to reduce their overall number by focusing on more permanent notes that accrete. Transient notes, which are notes used as scratchpads or for very short-term memory, are useful tools; however, my work should build upon itself and grow more useful over time. I fear I spend too much time creating and managing Transient notes and not enough time building a long-lasting repository of ideas and knowledge.

Another way I’m being more deliberate about my Transient notes is to segment off meeting notes. Meeting notes are important, and I often review them a few days after creating them, but I never review them after that. They wind up being Transient notes in a sea of Transient notes. I’ve just switched over to Agenda specifically for meeting notes. I prep for meetings by writing notes ahead of time, and I create real-time notes during the meetings themselves. I like having these meeting notes stored in a separate system and tied directly to the meetings they were created in. I triage my meeting notes, usually into todos or stories, and now I’ll convert any useful insights into permanent notes with tags, to be surfaced again sometime in the future.

Thinking about notes

There are many theories of productivity around task management but not as many around note-taking. I recently stumbled upon 20th-century German sociologist Niklas Luhmann’s Zettelkasten system. I still have a lot to learn, but he wrote complete ideas down on index cards, tagged them in a way that connected them to related ideas and stored them all in a box. From there I stumbled into Andy Matuschak’s Evergreen Notes and the entire web of his site.

One thing about Matuschak’s work that stuck out for me was the concept of Transient Notes.

“Most people use notes as a bucket for storage or scratch thoughts. These are very convenient to write, but after a year of writing such notes, they’ll just have a pile of dissociated notes. The notes won’t have added up to anything: they’re more like fuel, written and discarded to help the author process their ongoing experiences.”

I’m trying to develop my own system of note-taking and am looking at other systems for inspiration. If you have any suggestions, hit me up on Twitter.

BUILD A Better Future

If you identify as an entrepreneur and want to contribute your skill set to building a future with more diverse leadership, I highly recommend you take a look at a not-for-profit organization called BUILD. BUILD is an entrepreneurship program for underserved high school students that teaches them how to build their own business while becoming the CEOs of their own lives.

This is about more than sparking a love of entrepreneurship in these students, although I’ve met many in the program who have told me they plan to start their own businesses after college. It’s a four-year program where students learn teamwork, leadership, self-confidence, communication, grit, problem-solving and self-management.

Does it work? I’ll let the metrics speak for themselves:

BUILD has chapters in NYC, D.C., The Bay Area, and Boston.

If you are new to volunteering and find the prospect uncomfortable, I can assure you that the BUILD team makes it very easy to participate. They are friendly, helpful and just as encouraging to volunteers as they are to the students. Just find your local chapter, email them to say you are interested in volunteering, and they will take it from there.

I also encourage you to read a letter from BUILD CEO, Ayele Shakur. “Together we can ReBUILD America to ensure a brighter future for ALL.”

Read Later Apps

Back when Instapaper was first released, mobile networks were slow and in many places non-existent. Podcasts and streaming services were in their infancy, phone screens were small, mobile sites were terrible and subway cars were completely offline. It was really great to have an app that would save and download articles while cutting out the web cruft, so that you could read and entertain yourself the next time you found yourself offline.

When Instapaper stopped innovating, many others appeared, of which Pocket and Reeder 3 were among my favorites. They iterated on the old paradigm I describe above, but I have found that my own needs and the environment around them have both changed.

I now favor read-later clients that operate more like bookmark managers. Two of my favorites are Keep and Abyss. I discovered this preference change when I first tried Abyss, which is so perfectly simple: save your articles and they appear in a table view; tap on one and it opens a web view. I realized I didn’t need articles to be offline (especially these last few weeks), my screens are all quite big, mobile sites can be pretty good now, and stripping out all the “cruft” robs the article of part of its experience. The other major benefit is that you can save non-articles, say the website for a new tool you want to check out, and they are treated like any other read-later content. The one thing that bothered me about Abyss was that on very large screens, like my 12.9” iPad Pro, the table view showed an overwhelming number of links at once.

MacStories wrote this week about Keep and I’m all in (and went Premium). The free version follows a similar paradigm to Abyss but looks better on big screens and has a few quality-of-life enhancements. The major Premium feature, which I just started experimenting with, is that it can read your articles out loud. It uses Google’s voice APIs and sounds pretty good. My biggest question is whether I can fit any more audio into my information diet right now.

Moshi iVisor Update

A few weeks ago I put a Moshi iVisor on my iPad. My very first impression was complicated. A naked iPad Pro screen is a beautiful thing: 120Hz and the most vibrant, colorful display of any device I’ve ever used. Putting an iVisor on it immediately turns it into a matte display and makes it feel more like a laptop. On the other hand, five minutes after putting it on, I was able to use my iPad on my balcony in sunlight.

My update is that after a few weeks I like it and have no plans to take it off. As the weather gets better, the ability to use my device outside is incredibly valuable. I have totally adjusted to the matte look and feel and find it preferable in many scenarios (like watching a movie with most of the lights off). One of the unexpected benefits is that it doesn’t collect fingerprints. A naked iPad screen notoriously collects your fingerprints, and in retrospect I realize how gross that is. Some people use the iVisor to add a little bit of texture and make the screen feel more like paper. I can confirm that it provides a little more friction, but I prefer the feeling of Pencil on the bare screen. The dulling of the colors and the extra texture are both a bummer for creating my art. If anything makes me bail on the iVisor, it will be the feeling that it is affecting my art.

For now, however, I look forward to using my iPad on my balcony this weekend and, some weekend soon, in Central Park.

Mindlessly Browsing with Purpose

For the last few months I’ve been moodboarding content from the internet. I created a new board for each week and saved every image that captured my attention that week. Two weeks into this practice I created one of my favorite paintings to date. Two nights after I made that painting, I was flipping through my moodboard and saw a direct connection between what I was saving and what I had made. It solidified my conviction in moodboarding, even if I found the tools a little janky. I tried using Notability, GoodNotes and VizRef, but they never lived up to my expectations. That was the primary reason I begged my way into the Muse beta when I first discovered it.

Now that I have a tool I like, I’m saving even more things. After stumbling upon the concept of a Commonplace Book, I started to save quotes, text, diagrams and in some cases entire websites.

It has inspired me to read more. I like to read, but when I come across some insightful SaaS stat, I archive it somewhere in my brain. I’ll draw connections from it but can never really reference the source material. I had no way of browsing all the insightful SaaS material I’ve come across. When someone asked me what a SaaS free-trial-to-paid conversion rate should be, I could never come up with a definitive answer. All of that makes reading a nebulous input into the creative process and, frankly, pushed me away from reading industry-specific things. A lot of people say a lot of things on the Internet; if I’m not saving the best stuff, it’s all just sitting in the same buffer waiting for garbage collection.

Now that I’m starting the practice of a Commonplace Book, I’m trying to read more in my free time to fill it up with insightful material for a later date.

moodboard and my painting

My painting in the bottom right.

Deepwork Today

To follow up on yesterday’s post, today I made time for some Deepwork. I woke up early (6 AM), made coffee, put on some Kid Cudi, grabbed my iPad and opened up Muse. I knew I had 2.5 hours before the world woke up, and a juicy product design problem that I had been putting off.

Muse connects with how my brain works and how I effectively used paper notebooks for ideation back in my Creative Director days. There is no chrome, just a few intuitive gestures, a card system, handwriting, text support and the ability to bring in outside materials. This morning I had Airtable open and was able to pull text from our content team’s schedule directly into Muse, where I was sketching, flow-charting and ideating.

Flow works best when your tools get out of the way; when you are not clicking around or trying to figure out the interface. I knew I was in the zone when connections just started getting easier and easier. When a new unanticipated problem emerged, I immediately knew how to solve it. Using the Pencil and my own handwriting helped in the way that sketching in a notebook does.

After an hour or so, I had sketched out the solution and the way I wanted to communicate it, but it was in my unreadable handwriting. I then started replacing my handwriting and sketches, piece by piece, with text cards and Photoshop. By the end I had something that connected with the team a few hours later.

It’s not yet my goal to achieve flow every day, but it is my goal to flex my creative muscles daily. I’ve started my Commonplace Book and will talk more about that soon.

Personal Development: Creativity

I have been doing a lot of personal development around creativity over the last few months.

It started with painting. While others were panic-buying toilet paper, I started to stock up on art supplies. I knew I would have a natural coping mechanism; I have long used painting as a technique for mental health. It’s an incredible feeling to create something. It’s a solitary space to let my thoughts wander while at the same time being full of intense focus. It’s a project where I’m the only stakeholder, and it’s a safe space to experiment, make mistakes and fail. I try to make time every day to paint.

A few years ago I used painting as a way to cope with work. At work there are many stakeholders, and I needed to be less precious about my ideas. I enjoyed having a mental space where I was 100% in control and free of commercial interests. I found that it allowed me to leave the artist at home and make better business decisions. I fell in love with a kind of creative freedom that is difficult, if not impossible, to achieve with a team of stakeholders.

Running Product at a startup studio is a wild experience. Instead of thinking about one company with a small group of stakeholders, I’m currently working on four companies nearing launch, three more in the early development stage, a handful of early experiments, two portfolio companies that have spun out, and stakeholders for each. I sometimes fall back on parking my creativity in favor of raw execution. It requires a lot of trust in the people around you and doesn’t give you the affordances of Deepwork or of obsessing over a single problem space. I don’t have a singular enough focus to have shower ideas.

My next personal development project is to bring creativity back into my product development process. I’m at the early stages of reading and cataloguing things that have worked well in the past. I’m getting better at managing my time and making space for creative work. I hope to develop a practice of Deepwork.

I’ve tried so many tools, systems and apps over the last few weeks, and I finally feel like I’m settling into the next stage of my process: making the time, practicing, and flexing my creative muscles in the workplace.

Modular Computing: From Laptop to Tablet

When I first switched over to iPad, I did so as a laptop replacement. Fewer distractions, managing communications via notifications, best-in-class apps, a 120Hz display with a great animation system and having my art pad in my pocket are all topics for future posts. However, the promise of a modular computer that would have me switching between a laptop and a tablet was starting to feel like a false promise.

This changed with the iPad Magic Keyboard. The way the keyboard elevates the iPad just begs you to grab it and go. The magnets connecting the keyboard and the iPad are strong in a way that makes it satisfying both to take the iPad off the case and to put it back on. Without question, I’ve been using it more in tablet mode since getting the new keyboard. Reading, conference calls and taking handwritten notes are all activities where I’m likely to pop it off the keyboard.

I believe that my tablet time is about to increase again. I received my invite to the Muse beta program and my mind is blown. This feels like how a tablet was always meant to be used to organize ideas. If you haven’t seen Muse, I encourage you to look at the hero video on their homepage. If that whets your appetite, go check out their Interface Handbook. This is the tablet experience I have been looking for, and I’m thrilled with my day-one results. I’ll write more about Muse in the near future as I integrate it into my daily workflow.


I love music and listen to it every chance I get. I’m listening to Juice WRLD right now. I wake up every Friday excited about the new hip-hop album releases. I spend Saturday mornings going through them. I use music on my commute to get my mind right for the day ahead.

Because of this, I’ve always found it very hard to hop on the podcast bandwagon. In January, when I first started going iPad-first, multiple people suggested that I listen to Adapt, and I was hooked. Not only did it help shift my mindset into a new way of computing, but host Federico Viticci is the ultimate vibes. From there I found out Viticci is on another show called Connected. More vibes. Then I realized both shows were part of the same network and tried other shows in it, including Flashback, The Test Drivers, Under the Radar, Cortex, Presentable and Upgrade. Then I found out they had two art podcasts, and now I listen to MakeDo and Pictorial. Then I added two more Viticci podcasts, which are technically part of MacStories.

So now I listen to podcasts, but almost exclusively these podcasts. On their own they are excellent, high-quality shows that have re-kindled my love of technology. As a whole, they are an extended community of podcasters, tech thinkers, developers, designers and artists showing up in each other’s shows, blog posts and Twitter feeds.

I subscribed to both the network and MacStories to support their excellent work. I get more value from them than I do from Netflix. Subscribing also gets you access to their Discord, which extends that community even further. It reminds me of the pre-Facebook Internet, where we hung out online with strangers who had shared interests and traded tips, culture, media and knowledge. It has been especially welcome these last few weeks, when human contact has been infrequent and the post-Facebook Internet appears locked into an outrage-fueled culture war.

Screenshots on iPad

I take a lot of screenshots for work. Along with saving inspiration and moodboarding, I’ll screenshot visual bugs or things I want to edit in both our web and native apps. I love the built-in screenshot annotation tools that let me take a screenshot and then illustrate my concept without even using another app. This is an enhanced experience on the iPad, where I can use the Pencil to draw all over the screenshot more accurately and better illustrate my thoughts. Sometimes, if I’m trying to do something even more complex, I’ll bring the screenshots into another tool for further editing, most commonly Photoshop or Magic Eraser. Every few days I go through my camera roll and delete all the screenshots after they’ve been shared or saved in another tool, because I don’t like a cluttered camera roll.

I’ve been doing it even more since getting the Magic Keyboard and discovering that the Mac shortcuts Cmd-Shift-3 and Cmd-Shift-4 work. It’s Cmd-Shift-3 to take a quick screenshot and Cmd-Shift-4 to take one and open it in the native annotation tool. I also like how you can take a screenshot of an entire webpage, not just what fits on your screen, with a toggle in the annotation tool.

I still take screenshots when using my Mac but annotating and processing them afterwards is more awkward. In fact sometimes I just send them to my iPad so I can draw on them.

iPadOS + iOS

It probably wouldn’t surprise you to find out that workflows on iPadOS and iOS are better integrated than those on macOS and iOS. iPadOS was derived directly from iOS, and most modern iOS apps run on iPadOS. Many apps run on all three platforms, and that number continues to grow with Catalyst making it even easier, but the interaction paradigms can be very different on macOS. An app like GoodNotes on macOS is a bizarre experience, where you can reference your handwritten notes but content creation makes little sense. Even apps that I think are very good on macOS, such as Fantastical, just feel different and not as useful as their iOS/iPadOS counterparts.

The end result is a more fluid and less cognitively intense experience when moving between iPad and iPhone. Want to check your to-do list? It’s the same icon on the same homescreen, and when you tap it, an identical experience. CloudKit syncing is so good now that you can pick up your work instantly on the other device.

This was especially great when my days included meeting someone for coffee and generally moving around a space. I could stop for groceries, and if a work emergency came up I never thought, “I have to get back to my Mac.” That said, I’ve been noticing the seamless transition more while at home, where I am more frequently shifting between work mode and home mode and the biggest differences between my computing devices are a keyboard, a trackpad and the size of the screen.

When I switched to iPad

Yesterday one of my posts was featured in Oz Lubling’s Culture Clash alongside one of Khoi Vinh’s posts, which makes this the perfect opportunity to talk about the day I switched to iPad. It was Khoi’s fault.

I had owned an iPad Pro 9.7” since 2016, which I purchased for the Smart Keyboard and Pencil. However, that was many years before iPadOS, and I could never grok the drawing tools at the time. So this iPad had been hanging around my apartment for a few years as a consumption device, and I thought I probably wouldn’t buy another tablet.

Then in November 2019, with the help of Khoi Vinh, Adobe released “full” Photoshop for iPad. I didn’t rush to try it, as I had read a lot of reviews saying it wasn’t ready for prime time.

To add additional context: I paint... a lot. I painted this morning. It’s an exercise in creativity, control and mental health. I have an easel, canvases and acrylic paint, and painting has become my most cherished hobby. But I don’t always have access to my supplies when I’m traveling, and looking into the near future, I realized how much more difficult it would be to paint once my baby arrives. In fact, every dad I meet says, “ahh, you paint every day? Wait until you have kids.” T minus three months.

So one Saturday I picked up the iPad Pro, downloaded Photoshop, grabbed my Pencil and decided to see what it could do. A few minutes later I had created this:

First iPad Painting

I fell completely in love. I realized I could create art digitally while retaining my own personal style. This is something I could never make happen with a mouse; I’m not even very good at using Photoshop on the Mac. A few hours later, I was at the Grand Central Apple Store buying the 12.9” model as an investment in my art practice. I went with the 12.9” because I wanted the biggest canvas I could get.

I spent the entire weekend making art on my iPad, and when Monday rolled around, I picked up my device and carried it to the office. I wanted it by my side at all times. I was still using the MBP, but I enjoyed leaving the Mac behind in meetings, where I had my second aha moment: I find the iPad less distracting and myself more present in meetings while using it to take hand-written notes.

That was the beginning of my journey to learn how to use the iPad as an everyday computer. Thanks, Khoi & the team at Adobe!

Safari + Magic Keyboard = The Anything Machine

When Apple announced iPadOS in 2019, which importantly included a desktop-class version of Safari, they took the first step towards making the iPad a real work machine. The first version was light-years ahead of what they had been shipping with iOS, but it failed some of my early performance tests. I put the iPad back on my desk and relegated it to a consumption device. Then, release after release, Safari kept getting better, and by the time I ditched my Mac for an iPad Pro in January, it could load almost any site. Almost.

While performance was no longer an issue, touch controls could not handle all modern web interactions, or on occasion (hi, Google Analytics) even simple UI elements. Where I could (hi again, Google Analytics), I would navigate around with keyboard shortcuts, but for sites where I wanted to pan, zoom and manipulate objects, like Figma or Balsamiq, it was not possible.

With the release of trackpad support, and its flagship accessory the Magic Keyboard, all of this changed. The trackpad now functions in Safari exactly like it does on macOS. This really hit me yesterday when someone asked me for a quick wireframe illustrating a conversation we’d had. We use Balsamiq for quick idea sharing, and it has a solid web app. Just as I was going to grab my Mac, I wondered if I could use Balsamiq on my iPad now. Boom! Success! I can drag things around the screen, and the cursor is small enough to grab the corners of an element to resize and manipulate it. The idea was captured and shared in the format my whole team expects.

There are many reasons why I think the Magic Keyboard took the iPad to the next level but perhaps the most important is that all of your web tools are now available to you.

The Apple Watch at Home

I have found that the Apple Watch is even more useful to me while being forced to work from home than it was when I was in the office. Even as I type that, it sounds counter-intuitive.

At the office, I’m almost always looking at a screen, or in a meeting doing my best not to look at my screen. Glancing at a Slack notification on my wrist while meeting with someone is at best distracting and at worst incredibly rude. I find little value in getting notifications on my wrist from people who are in the same building as me.

At home I’m so much more flexible in my day and more likely to walk away from a screen. For better or for worse, life and work are more of a blurred experience, especially when working with an international team and co-workers whose own schedules have been flipped upside down. In return, I might walk away from the screens at 11 AM to do the dishes while parts of my team are at their busiest. The Apple Watch keeps me connected and lets me know if anyone is trying to get my attention while I’m running an errand or walking my dog. That reduces the anxiety that I will step away from the screens while someone desperately needs me, and it helps me manage my work/life balance in a world where everyone knows you are home and likely not that busy.

Slack Browser Preferences

I only discovered this after getting the new iOS version of Slack, which makes preferences more intuitive to find: you can change the browser Slack opens links in.

For most apps I enjoy having a simple, streamlined in-app browser. As long as it has a simple way to open the link in Safari, then I’m all for it. However, I’ve always thought it was an awkward choice to have the in-app browser in Slack. A common use case is that someone shares a link with me, the in-app browser takes over, and I have to close the link before I can respond to it. Then I often want to refer back to the link’s contents, so I have to click in and then click close again to continue the chat. I found this particularly frustrating on iPadOS, where multi-tasking is so much better than on iOS. After changing the Web Browser setting, I can keep Slack open with Safari in Split View and reference the content while chatting about it. Problem solved.

You can find this setting in the current iPadOS app by tapping the three dots in the upper right of any chat conversation, then Settings->Advanced->Web Browser.

With the desktop and iOS apps recently refreshed, we can only hope they are working on a new iPad app that includes video chat and screen sharing.

Contact Tracing & Ubiquitous Computing

If you had told me in 1997 that giving everyone unfettered access to the world’s information would result in an erosion of public trust, we would have had a philosophical fight. Here we are in 2020, and I was wrong, but that is a conversation for another day.

However, one of my favorite paradigms, Ubiquitous Computing, is finally starting to mature three decades later in ways we always expected. Mark Weiser coined the term ubiquitous computing in his 1991 paper, The Computer for the 21st Century, and I was introduced to it while at Dartmouth (around 2002) by Professor David Kotz. Weiser’s paper is a classic that is still relevant today. Ubiquitous Computing imagines a world where millions of computers are everywhere, embedded into the environment around us. Those devices can recognize us and then, through automation and contextual computing, manipulate the world around us and deliver high-value, contextual information.

At the time, we were thinking about smart dust painted on walls and public displays that you could just walk up to and use. We hadn’t considered 3.5 billion pocket computers with high-res personal displays. We discussed privacy in theoretical ways, but there were few real-world examples of digital privacy nightmares in 2002. Then, as this new ubiquitous computing environment matured around us, it was harnessed to create a literal world-changing, life-saving app: Privacy-Preserving Contact Tracing by Apple & Google.

Our devices (and advertisers) already know everywhere we’ve been; they should know about every other device we’ve come in contact with and, importantly, notify us if exposure to one of those devices has put our life in danger. This should be done in a privacy-conscious way where your identity is protected, personal information is never shared with governments or corporations, and all computation happens locally.
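To make the "all computation happens locally" idea concrete, here is a heavily simplified sketch of how decentralized exposure notification can work. This is an illustration of the general technique, not the actual Apple & Google protocol (which uses HKDF/AES key derivation and rotating Bluetooth identifiers); the `Device` class and its methods are hypothetical names for this sketch.

```python
import hashlib
import secrets

class Device:
    """A phone that broadcasts anonymous tokens and records ones it hears."""

    def __init__(self):
        self.daily_keys = []       # random secret keys, one per day, kept on-device
        self.heard_tokens = set()  # tokens overheard from nearby devices

    def new_day(self):
        # A fresh random key each day; tokens derived from it can't be
        # linked to us by anyone who doesn't hold the key.
        self.daily_keys.append(secrets.token_bytes(16))

    def broadcast_token(self, interval: int) -> bytes:
        # Derive a short rotating token from today's key and the time interval.
        data = self.daily_keys[-1] + interval.to_bytes(4, "big")
        return hashlib.sha256(data).digest()[:16]

    def hear(self, token: bytes):
        self.heard_tokens.add(token)

    def check_exposure(self, published_keys, intervals=range(144)) -> bool:
        # Matching happens entirely locally: re-derive the tokens from keys
        # that infected users chose to publish, and compare against what
        # this device overheard. No identities or locations are exchanged.
        for key in published_keys:
            for i in intervals:
                token = hashlib.sha256(key + i.to_bytes(4, "big")).digest()[:16]
                if token in self.heard_tokens:
                    return True
        return False

# Two devices pass each other on the street.
alice, bob = Device(), Device()
alice.new_day()
bob.new_day()
bob.hear(alice.broadcast_token(interval=42))

# Alice later tests positive and publishes only her daily keys;
# Bob's device checks for a match without ever learning who Alice is.
print(bob.check_exposure(alice.daily_keys))  # True
```

The key design point this sketch captures: the server (if any) only ever sees the random daily keys of people who opt in after a positive test, and the matching against your own contact history never leaves your device.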

This is what Apple & Google have delivered on in a heroic effort that reminds me most of the stories I’ve read about U.S. companies rising to the challenge of World War 2.

Now if only we trusted each other. Perhaps we could agree that saving lives is a more important use case of our data than serving up ads.

iPad but with a matte screen

Summer has arrived in NYC and the weather is nice, which makes sheltering in place more... frustrating. I’m fortunate enough to have a balcony, and I plan to spend more time working from it. However, anyone who has tried to work with a device outside knows how challenging it can be to see your screen in direct sunlight.

Enter the Moshi iVisor experiment. The iVisor is a screen protector that turns your iPad’s display from glossy to matte, which also makes it visible in the sun. I’m a few hours into using it, and it seems to work as I had hoped. I write this from my balcony with the sun in my eyes.

It also adds a little texture to the screen when using the Pencil, which I’ve heard appeals to people used to writing on paper. I’ll probably get into some art later today to test it out. My one concern at the moment is that it does, ever so slightly, mute all of the colors, and while I think that will have little impact on my work, I wonder what it will mean for my art.

I’ll provide updates on this experiment after at least a few days of use.

iA Writer

Alright, this is what I’m actually here for. I have iA Writer hooked up to

I don’t do a ton of writing, and when I do I often share it within my organization via Google Docs. First, Google Docs is a terrible experience on iPadOS. It doesn’t support any modern/expected OS features, and it has some strange bugs/decisions... for instance, I can’t create a new line in a Sheets cell.

My workaround is that I have a (company-issued) MacBook Pro in my (currently home) office which I VNC into via Screens. Works like a champ.

But I was trying to write in praise of iA Writer, which feels like it has been on iOS forever. It’s so incredibly beautiful and peaceful and modern. The font choice, background color and overall feel of the app are incredible. As if Dieter Rams designed a writing app!

I’ve never had a great reason to use it, yet at the same time it’s always been on my phone and iPad, so here I am, using this post as an excuse to use iA Writer :)

I started this blog to talk about iPadOS, productivity, computing and computer science dreams I still have from 1997.

I’m a Venture Partner and Head of Product at a startup studio called Five Four Ventures. I spend every day building and launching new companies with my iPad Pro as my primary computing device. I’m a mixed-media artist using mostly acrylic, canvas and iPad, and on occasion I still write, in increasingly worse quality.