Street art, robots and the quest to humanize smart cities
Future Cities Lab Co-Founder Jason Kelly Johnson on pushing the boundaries of the built environment.

Jason Kelly Johnson will be speaking at VERGE 2015, Oct. 26-29 in San Jose, California.
What does a smart city really look like? And more important, what can it actually do for its inhabitants?
These are a few of the big questions that architect Jason Kelly Johnson and his San Francisco design outfit, Future Cities Lab, are looking to explore through their aesthetically pleasing tech experiments in the built environment.
The company employs a mix of designers, developers and engineers for both commissioned works — such as the interactive “Lightswarm” exhibit currently absorbing sonic data at the Yerba Buena Center for the Arts — and R&D projects.
The result is a melding of art, architecture, urban planning, hardware and software that underscores an explosion in urban data collection and analytics capacity.
From illustrating alarming climate projections more vividly to illuminating cities with train-powered lights and imbuing buildings with artificial intelligence, Kelly Johnson shared the firm’s vision in an edited interview with GreenBiz.
Lauren Hepler: Why call it a lab as opposed to an architecture or design firm?
Jason Kelly Johnson: It has to do with the structure of what we’re doing. Mostly we’re developing custom technologies and custom software. I’d say most traditional practices are working with things that are essentially off the shelf, whereas we’re involved in basically a lot of R&D.
Hepler: What are some of the R&D projects you’ve worked on? I saw one image of something called a “Datasprayer” on your website.

Kelly Johnson: When you get a parking ticket in San Francisco, or if there’s an automobile accident or a crime, those things are entered into a database. San Francisco has an open data website where you can download not only the event that took place but also its GIS coordinates, though it takes a while to access.
The project started when I was looking at a website mapping all the bicycle accidents in San Francisco and realized that I travel very regularly through the intersection with the most accidents.
Hepler: Where is that?
Kelly Johnson: Mission and 16th, a kind of crazy intersection. The thought was to try and develop a robot that could go out when there was some kind of event and spray paint a symbol of what happened — basically a kind of AI element that would go out and tag sites.
The thing went out in our neighborhood in Dogpatch and painted a toxic symbol in front of buildings with reported toxic industries in them. We did the same thing with handgun symbols. So many people don’t have any idea what happens in their neighborhood at 2 in the morning. It’s almost like taking a Google map with this data and imposing it directly onto the city.
Hepler: It’s fusing low-tech street art with high-tech robotics.
Kelly Johnson: That was sort of the idea. We’re outfitting cops with all these tools and tax collectors with their tools. The idea was to give people more knowledge of the city.
Hepler: How could that impact the way we think about cities more broadly?
Kelly Johnson: We did a version in Venice last summer. We had the robot go out and draw the high water mark for the year 2040. Basically half of Venice is supposedly going to be under water in 20 or 25 years, so we found that line and had the robot go out and spray paint the topographic high water mark line through the city as a way to talk about global warming and climate change and flooding.
The idea was to raise awareness. People saw this bright pink line and didn’t have any idea what it was, but then when you started to explain what it was, they were, like, “Wait, holy crap, really?” You’ve seen tons of maps about the exact same topic, but the idea was to bring that data out into the city.
Hepler: So you’ve mentioned climate and chemicals, but how does sustainability figure into the overall work that you’re doing?
Kelly Johnson: We’re looking at integrating technologies, like making buildings smart and sustainable from the inside out. You can do smart things with materials, with sensors and with something we’re calling forward sensing. Right now there’s such a focus on having sensors integrated inside smart buildings and smart homes.
We’re interested in how the AI of a building actually starts to harness things like social media and the predictive data that’s already kind of out there. It’s not just about a building being responsive and reacting instantaneously. It’s about a building becoming predictive and evolutionary.
Hepler: I also saw some work on your site related to machine learning and other kinds of robotics. How does that all fit in?
Kelly Johnson: We’re doing some things looking at pattern recognition. The same way you might be walking through a city and realizing, “Wow, it’s starting to get really crowded,” how do computers begin to understand the same qualities that humans do?
Hepler: Who is your target audience for all of this? Are there real estate developers or corporate campus managers actually thinking about these types of technologies?
Kelly Johnson: We’re mostly producing pieces that are commissioned, essentially art pieces at this point. The people who are most interested, we’re finding, are a little more forward-thinking and want to explore it in galleries or public spaces. We have a few clients we’re working with, but they’re mostly technology companies selling microchips and working with data.
We’re funded through a couple of different sources. It isn’t a building owner thing yet, but we can imagine it will be at some point. We’re most interested in being at the forward edge of it all, and companies can begin to spread out from there.
Hepler: What cities are you working in?
Kelly Johnson: We’ve been doing projects in New York. We’re doing a really large project in Washington, D.C., called Lightweave. It’s an architectural-scale project right near the U.S. Capitol Building and Union Station in a neighborhood called NoMa (North of Massachusetts Avenue).
Hepler: I worked there before it was called NoMa. Very industrial.

Kelly Johnson: Yeah, they renamed it. But that whole neighborhood got together and found funding. It’s partially funded by the federal government. They’re paying for us to do a huge project as long as a football field. It will be an interactive art piece in these terrible underpasses. I mean they’re beautiful underpasses, but they were formerly really scary.
We’re doing a really, really large-scale light installation using vibration sensors. As trains go overhead, they trigger light effects in the tunnel. It’s going to be a really beautiful woven steel lattice, and then it will have a set of ambient LEDs in these tubes. It’s basically an art piece in the end, but it’s urban architecture.
Hepler: There are also a lot of big companies talking about things like smart lighting and dynamic advertising in smart cities. How do you see your work fitting in?
Kelly Johnson: A lot of the stuff we’re seeing coming out of really large companies is all about optimization. It’s about increasing efficiency, increasing speed, increasing profit margins.
Most of what we’re up to is kind of counter to that. We’re actually trying to slow people down with technology and make them more thoughtful, more connected, more social. We are kind of piggybacking off of their technologies, definitely, but I think our agenda is different.
Hepler: So how do you approach a macro tech trend like the Internet of Things?
Kelly Johnson: It’s pretty obvious to us that objects as we know them are going to start to be threaded with sensing capacity. You’ll start to know their X-Y-Z coordinates in the world. That alone is intriguing, but when you begin to couple geographic location with other things, like sound levels and ambient air temperature, you start to know a lot about the qualities of things.
You begin to think that computers, and the way computers deal with the world, will become more complex. But how do we treat that? How does that become not just a tool to sell us things, not just encouragement to consume, but a way to give people a more in-tune understanding of their effect on the environment and the knowledge to act in a more informed manner?
Hepler: So you see one potential benefit as better stewardship?
Kelly Johnson: Think of it this way. Humans have never been able to see around corners. We’ve been able to hear things and intuitively think about them, but now we’re gaining the capacity to see things far away.
We’re basically creating the capacity for humans to become super sensitive. Your brain, through your smartphone, is being connected to vast databases of knowledge that are being updated in real time. Your brain suddenly has so much more potential than it ever has had, for better or worse.
