The Metropolitan Museum of Art, New York: Zemí Cohoba Stand
Created by: The Imaging Department (Barbara Bridgers, Scott Geffert, Xue Chen and Deepa Paulus), The Metropolitan Museum of Art, New York
An augmented reality experience that allows you to place a zemí cohoba stand from around AD 1000, the centrepiece of Arte del Mar: Artistic Exchange in the Caribbean at the Metropolitan Museum of Art, in your own surroundings. As previously reported in The Art Newspaper, the wooden sculpture probably originated in what is now the Dominican Republic. The exhibition closed because of the Covid-19 pandemic in mid-March, reopened with the rest of the museum on 29 August, and runs until 27 June 2021.
Where to Find It
On the Met's website, using browser-based Web Augmented Reality (WebAR); no app download is required.
The AR model works on iOS devices, and best on an iPhone running iOS 13.4.1 or later. There is also a browser version for laptops, where you can at least explore the work in 3D. Our panellists were troubled by the lack of an Android version, given that globally Android users outnumber iOS users by roughly three to one.
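The article does not say how the Met built its WebAR page, but as a sketch of how a browser-based, no-download AR placement like this is commonly delivered, Google's open-source `<model-viewer>` web component can serve a GLB model with a USDZ fallback for iOS AR Quick Look. The file names below are placeholders, not the Met's actual assets:

```html
<!-- Load Google's <model-viewer> web component from a CDN. -->
<script type="module"
        src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<!-- Placeholder asset names for illustration only.
     "ar-modes" falls back from WebXR to Android Scene Viewer to iOS Quick Look;
     "ios-src" supplies the USDZ file that iOS AR Quick Look requires. -->
<model-viewer
    src="zemi-cohoba-stand.glb"
    ios-src="zemi-cohoba-stand.usdz"
    ar
    ar-modes="webxr scene-viewer quick-look"
    camera-controls
    alt="Zemí cohoba stand, around AD 1000">
</model-viewer>
```

On devices without AR support, the same element degrades gracefully to the in-page 3D viewer with rotate and zoom controls, which matches the laptop experience the panel describes.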
They Say:
The Metropolitan Museum of Art: The Taíno concept of zemí pertains to the force of deities and ancestors that permeates the Caribbean landscape. This rare wooden image harnesses that environmental power into a particular zemí, a central figure in community ceremonies, including healing. To share this zemí beyond the walls of the Museum in a time of suffering keeps with the original intent of providing inspiration to all those who experience its beauty.
The XR panel's ratings
Carole Chainon: 3D object placement experiences using WebXR started emerging about two years ago; an example is David Bowie's spectacular costumes in augmented reality, presented by The New York Times in early 2018. The experience starts right from the article in the browser, sparing the user from downloading an app. The user is instructed to scan the room so horizontal planes can be detected, and the artwork is then placed directly in the user's environment. Interactions are minimal: selecting the audio language and rotating or scaling the artwork. The quality of the artwork is the most impressive feat in this experience.
Dhiren Dasu: I don’t feel that it conceptually breaks any new ground. Having said that, the level of rendering and detail inherent in the implementation is the best of any AR I’ve seen. So kudos to the imaging team.