1. prostheticknowledge:

    Punctuate

    Creative coding project by Jason Lin that converts text into 3D geometric drawings, turning writing into a visual grammar - video embedded below:

    My latest Processing project!

    … I came up with this idea because my last project used an Excel sheet and received insane amounts of numbers and data.
    This time I wanted to use a text file and receive insane amounts of words and letters and most importantly, punctuation!

    I had seen pictures of “sentence maps” before where a line was created and it got longer with every word and made a turn every time the sentence ended. Colors would change with every character or some other factor.

    Basically I wanted to take this idea and make it HUGE. I wanted an entire 3D explorable environment.

    The video explains what I chose to do for every single type of punctuation mark.
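    The project itself isn't public, but the "sentence map" behaviour described above is easy to prototype: the path grows by one segment per word and turns whenever a sentence-ending mark appears. A minimal 2D sketch in Python follows (this is not Jason's Processing code; the per-mark turn angles are made up and the colour-per-character part is left out):

    ```python
    # Minimal "sentence map" sketch: one step per word, a turn at each
    # sentence-ending punctuation mark. Kept in 2D for brevity; the original
    # project extends the idea into an explorable 3D environment.
    import math
    import re

    TURN_ANGLE = {".": 90, "!": 120, "?": 60}  # hypothetical per-mark turns, in degrees

    def sentence_map(text, step=1.0):
        """Return a list of (x, y) points tracing the text as a path."""
        x, y, heading = 0.0, 0.0, 0.0
        points = [(x, y)]
        for token in re.findall(r"\S+", text):
            # advance one step per word
            x += step * math.cos(math.radians(heading))
            y += step * math.sin(math.radians(heading))
            points.append((x, y))
            # turn when the word ends a sentence
            if token[-1] in TURN_ANGLE:
                heading += TURN_ANGLE[token[-1]]
        return points

    if __name__ == "__main__":
        sample = "Hello world. Does this turn? Yes! It keeps going."
        for px, py in sentence_map(sample):
            print(f"{px:6.2f} {py:6.2f}")
    ```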

    More at Jason’s art blog here

    The project hasn’t been made available to the public yet, but at Jason’s Tumblr blog (obeserhino) you can send him suggestions to try out. [Link]

     


  5. new-aesthetic:

    Satellite Lamps by Einar Sneve Martinussen, Jørn Knutsen, and Timo Arnall.

    The city is changing in ways that can’t be seen. As urban life becomes intertwined with digital technologies, the invisible landscape of the networked city is taking shape—a terrain made up of radio waves, mobile devices, data streams and satellite signals.

    Satellite Lamps is a project about using design to investigate and reveal one of the fundamental constructs of the networked city—the Global Positioning System (GPS). GPS is made up of a network of satellites that provide real-time location information to the devices in our pockets. As GPS has moved from specialized navigation devices to smartphones over the last 10 years, it has become an essential yet invisible part of everyday urban life.

    William J. Mitchell (2004) described the landscape of the networked city as an invisible electromagnetic terrain. In Satellite Lamps we explore and chart this terrain, showing how GPS is shaped by the urban spaces where it is used. GPS, alongside wireless networks, algorithms, and embedded sensors, is among the invisible technological materials that comprise many modern products. Created by a small team of design researchers at the Oslo School of Architecture and Design, Satellite Lamps is a part of our ongoing research into making technologies visible and communicating and interpreting their presence in daily life. As designers we typically shape how technologies like GPS are being used, but with Satellite Lamps we use our practice to address how they can be understood.
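    As I understand the project, the lamps change brightness with the quality of the GPS fix they receive, which is how the invisible terrain becomes visible. A rough sketch of that kind of mapping, assuming a receiver that streams standard NMEA GGA sentences and a lamp driven by an 8-bit PWM value (this is not the project's code):

    ```python
    # Sketch: translate GPS fix quality into lamp brightness.
    def parse_gga(sentence):
        """Extract satellite count and HDOP from a $GPGGA sentence."""
        fields = sentence.split(",")
        sats = int(fields[7] or 0)
        hdop = float(fields[8] or 99.0)  # horizontal dilution of precision
        return sats, hdop

    def brightness(sats, hdop, max_pwm=255):
        """More satellites and lower HDOP -> brighter lamp."""
        if sats == 0:
            return 0
        quality = min(sats / 12.0, 1.0) * min(1.0 / max(hdop, 0.1), 1.0)
        return int(quality * max_pwm)

    if __name__ == "__main__":
        gga = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
        print(brightness(*parse_gga(gga)))  # 169 for 8 satellites, HDOP 0.9
    ```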

     
     


  6. (Source: brucesterling)

     


  8. blech:

    Mayflies, La Crosse, Wisconsin, from CityLab (via bluecloudsfloating)

     


  9. (Source: vimeo.com)

     
     


  10. hellocatfood:

    RGB.VGA.VOLT from James Connolly on Vimeo.

    RGB.VGA.VOLT is an audio/video synthesizer that enables realtime exploration of the rich materiality concealed beneath the consumer interfaces of cathode ray tube computer monitors. By hacking and improperly rewiring the cables of these obsolete devices to short their video signals, their black-boxed analog infrastructure is liberated and driven by digitally synthesized high-frequency complex waveforms, audio playback, and feedback loops that fully exploit their latent visual spectrum. Inspired by early video tools such as the Sandin Image Processor and the Paik-Abe Synthesizer as well as contemporary programming and error-based approaches to sound and moving image, RGB.VGA.VOLT revives an analog aesthetic that has acquired a renewed power and potency in the present era of digital immateriality.

    The instrument, demonstrated on four VGA computer monitors in this video, combines custom built software, hardware modification, and DIY instrumentation.

    jameshconnolly.com/rgb-vga-volt
    COPY-IT-RIGHT 2014
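    As a rough illustration of the signal side only (this is not James Connolly's software): a high-frequency complex waveform written to a WAV file, of the sort that could be played from a sound card into the shorted video lines of a rewired VGA cable. The frequencies and the two-oscillator mix are placeholders:

    ```python
    # Generate a short high-frequency waveform and write it to a WAV file.
    import math
    import struct
    import wave

    RATE = 96000      # high sample rate to reach video-ish frequencies
    SECONDS = 2

    def sample(t):
        """Sum of a carrier and a slowly drifting modulator, scaled to 16-bit."""
        carrier = math.sin(2 * math.pi * 15734 * t)            # near NTSC line rate
        modulator = math.sin(2 * math.pi * (60 + 5 * t) * t)   # slow drift
        return int(32767 * 0.5 * carrier * modulator)

    with wave.open("vga_drive.wav", "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        frames = b"".join(struct.pack("<h", sample(n / RATE))
                          for n in range(RATE * SECONDS))
        w.writeframes(frames)
    ```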

     
     


  11. right-to-flight:

    One of a series of videos produced from aerial footage, shot with a GoPro camera from the Right To Flight balloon, two hundred feet above the streets of Peckham, South London.

    These videos continue my use of the “rorsching” technique, previously used on Google Maps, Street View, and NYC traffic cameras (rorschmap.com), to transform automated and potentially infinite, “neutral” data and footage into abstracted artworks, inviting contemplation and analysis while reducing their utility as information or surveillance.
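    Assuming "rorsching" works the way rorschmap's kaleidoscope does, i.e. by mirroring the frame across both axes so the footage reads as a symmetrical, Rorschach-like abstraction rather than as usable imagery, a minimal sketch with Pillow (the filenames are placeholders):

    ```python
    # Tile a frame with its horizontal and vertical mirror images.
    from PIL import Image, ImageOps

    def rorsch(frame):
        """Return a 2x2 mosaic of the frame and its mirrored copies."""
        w, h = frame.size
        out = Image.new(frame.mode, (w * 2, h * 2))
        out.paste(frame, (0, 0))
        out.paste(ImageOps.mirror(frame), (w, 0))                  # flip left-right
        out.paste(ImageOps.flip(frame), (0, h))                    # flip top-bottom
        out.paste(ImageOps.flip(ImageOps.mirror(frame)), (w, h))   # flip both
        return out

    if __name__ == "__main__":
        rorsch(Image.open("gopro_frame.jpg")).save("gopro_frame_rorsch.jpg")
    ```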

    (via stml)

     
     


  12. (Source: melanieflood, via selfctrl)

     


  13. Sounds of Streetview: embed sound files within Google Street View which play as you move around.

    Their implementation is pretty dull but the tool has some potential for interestingness, I think.
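    A minimal sketch of the underlying idea rather than the tool's actual API: a sound is pinned to a lat/lon, and its playback volume falls off with the listener's distance as they move. The falloff curve and the 50 m audible radius are made up:

    ```python
    # Distance-based gain for a geo-positioned sound.
    import math

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two lat/lon points."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def gain(listener, source, audible_radius=50.0):
        """Linear volume falloff from 1.0 at the source to 0.0 at the radius."""
        d = distance_m(*listener, *source)
        return max(0.0, 1.0 - d / audible_radius)

    if __name__ == "__main__":
        print(gain((51.5007, -0.1246), (51.5010, -0.1243)))  # ~0.2 with the source ~40 m away
    ```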

     
     


  14. First-person Hyperlapse Videos

    We present a method for converting first-person videos, for example, captured with a helmet camera during activities such as rock climbing or bicycling, into hyperlapse videos: time-lapse videos with a smoothly moving camera.

    At high speed-up rates, simple frame sub-sampling coupled with existing video stabilization methods does not work, because the erratic camera shake present in first-person videos is amplified by the speed-up.

    Our algorithm first reconstructs the 3D input camera path as well as dense, per-frame proxy geometries. We then optimize a novel camera path for the output video that is smooth and passes near the input cameras while ensuring that the virtual camera looks in directions that can be rendered well from the input.

    Next, we compute geometric proxies for each input frame. These allow us to render the frames from the novel viewpoints on the optimized path.

    Finally, we generate the novel smoothed, time-lapse video by rendering, stitching, and blending appropriately selected source frames for each output frame. We present a number of results for challenging videos that cannot be processed using traditional techniques.
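    The full method depends on 3D reconstruction, but the trade-off in the path-optimisation step, a smooth output path that still stays near the input cameras, can be shown with a toy version. This is not the paper's implementation; it smooths a shaky 2D path by minimising a second-difference (curvature) penalty plus a proximity penalty:

    ```python
    # Toy camera-path smoothing: minimise ||p - cams||^2 + w * ||D p||^2,
    # where D is the second-difference operator, via a direct linear solve.
    import numpy as np

    def optimise_path(cams, smooth_weight=50.0):
        """cams: (N, 2) shaky input camera positions; returns a smoothed path."""
        n = len(cams)
        d = np.zeros((n - 2, n))
        idx = np.arange(n - 2)
        d[idx, idx], d[idx, idx + 1], d[idx, idx + 2] = 1.0, -2.0, 1.0
        a = np.eye(n) + smooth_weight * d.T @ d   # normal equations
        return np.linalg.solve(a, cams)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 200)
        shaky = np.stack([t, np.sin(t)], axis=1) + rng.normal(0, 0.15, size=(200, 2))
        smooth = optimise_path(shaky)
        # curvature drops sharply while the path stays near the input cameras
        print(np.abs(np.diff(shaky, 2, axis=0)).mean(),
              np.abs(np.diff(smooth, 2, axis=0)).mean())
    ```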

     
     


  15. prostheticknowledge:

    Atonal Architectonics : Ziggurat (Archstoyaniye 2014)

    Audio-visual, location-specific sound installation by SoundArtist.ru that uses field recordings and an architectural design as a notation score - video embedded below:

    Documentation of the generative sound-sculpture produced for Archstoyaniye festival in July 2014 (Nikolo-Lenivetz, Russia).
    The work uses spectral analysis to synthesise an achromatic sound scale in formal relation to the architecture of the building, using field recordings as the only source of timbre.
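    A small sketch of the general approach the quote describes rather than the installation's code: spectral analysis of a recording to pull out its strongest partials, which could then serve as the pitch material of a scale. The "recording" here is stand-in noise plus two tones; a real use would load the field recording instead:

    ```python
    # Extract the most prominent partials of a recording via FFT.
    import numpy as np

    RATE = 44100

    def prominent_partials(signal, rate=RATE, count=8):
        """Return the `count` strongest frequencies in the signal, in Hz."""
        spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
        return np.sort(freqs[np.argsort(spectrum)[-count:]])

    if __name__ == "__main__":
        t = np.arange(RATE * 2) / RATE
        recording = (0.05 * np.random.default_rng(1).normal(size=t.size)
                     + np.sin(2 * np.pi * 220 * t)
                     + 0.6 * np.sin(2 * np.pi * 330 * t))
        print(prominent_partials(recording))  # clusters around 220 Hz and 330 Hz
    ```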

    Link

    Very much related to what I’m working on, but completely different. Curious.