Smart glasses are the holy grail of augmented reality, which promises to overlay digital content on top of the physical world (think Pokémon Go, except with a lot more than cartoon characters). Seemingly every company in Silicon Valley is interested in AR, but on Wednesday, Facebook took a big step to the front of the pack with its announcement of “Project Aria.” The company, which owns the Oculus VR headset, will be sending smart glasses into the wild on the heads of Facebook testers. An initial group of 100 or so testers will be gathering data to build a personalized assistant and, importantly, provide a first-person perspective to map the real world for AR.

The tech on these headsets, which Facebook insists is neither a consumer product nor a prototype, is impressive. High-resolution cameras and microphones help the glasses see our world; an array of sensors allows the device to understand where it exists and how it is oriented in physical reality. The buzzphrase to explain how spatial computing works is “simultaneous localization and mapping,” or SLAM, but the end result is that smart glasses will need to hoover up a huge amount of raw data just to function.
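To make the SLAM idea concrete, here is a toy illustration. This is not Facebook’s actual pipeline and is a drastic simplification of real SLAM, which fuses camera and inertial data probabilistically; all names here are invented. It shows only the core loop: the device tracks its own pose (localization) while placing observed landmarks into a shared map frame (mapping).

```python
# Toy illustration of SLAM's two jobs: track the device's own pose
# (localization) and record where landmarks sit in a common map frame
# (mapping). Here the device simply dead-reckons from motion estimates
# and converts device-relative observations into world coordinates.

def run_toy_slam(motions, observations):
    """motions: per-step (dx, dy) the device believes it moved.
    observations: per-step dict {landmark_id: (rel_x, rel_y)} of
    landmarks seen relative to the device.
    Returns the final pose and the landmark map that was built."""
    pose = (0.0, 0.0)          # start at the map origin
    landmark_map = {}          # landmark_id -> estimated world position
    for (dx, dy), seen in zip(motions, observations):
        pose = (pose[0] + dx, pose[1] + dy)        # localization step
        for lid, (rx, ry) in seen.items():         # mapping step:
            world = (pose[0] + rx, pose[1] + ry)   # device-relative ->
            landmark_map[lid] = world              # world coordinates
    return pose, landmark_map

pose, lmap = run_toy_slam(
    motions=[(1, 0), (1, 0)],
    observations=[{"door": (2, 0)}, {"door": (1, 0)}],
)
print(pose)            # (2.0, 0.0)
print(lmap["door"])    # (3.0, 0.0) -- same landmark, consistent position
```

Even in this stripped-down form, the data appetite is visible: every step requires both motion estimates and raw observations of the surroundings, which is why real smart glasses must continuously ingest sensor, video, and audio streams just to know where they are.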

Some may see this as the latest iteration of Google “Glassholes” unleashed on the world, but the technology has advanced by leaps and bounds since Google first tried to make smart glasses happen in 2013. To its credit, Facebook has announced a suite of privacy protections for Project Aria. Data will be quarantined from the Facebook mothership and accessed only by a small subset of Facebook researchers. Where and what the smart glasses will record is also limited in some respects. Taking a lesson directly from Google Street View’s struggles to map the world, Facebook will automatically blur faces and license plates. Research testers won’t use the glasses in locker rooms and other private areas, and Facebook Reality Labs even announced a set of very high-level responsible innovation principles. It all signals an understanding by the company that AR will strike some as very creepy.
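The redaction step behind Street View–style blurring can be sketched simply. This is an illustration, not Facebook’s implementation: detection (the hard part, done in practice by trained detectors) is assumed, and the “blur” here just replaces a known bounding box with its mean value, where production systems typically apply a Gaussian blur instead. The function and image below are invented for the example.

```python
# Sketch of automatic redaction: once a detector returns a bounding box
# for a face or license plate, the pixels inside it are overwritten so
# identifying detail is unrecoverable. The image is grayscale, stored as
# a list of rows; the box is (top, left, bottom, right), half-open.

def blur_region(image, box):
    """Replace every pixel inside the box with the region's mean value."""
    top, left, bottom, right = box
    pixels = [image[y][x] for y in range(top, bottom)
                          for x in range(left, right)]
    mean = sum(pixels) // len(pixels)
    for y in range(top, bottom):
        for x in range(left, right):
            image[y][x] = mean
    return image

img = [[0,   0,   0,   0],
       [0, 255, 200,   0],
       [0, 180, 220,   0],
       [0,   0,   0,   0]]
blur_region(img, (1, 1, 3, 3))   # flatten the 2x2 "face" in the middle
print(img[1][1], img[2][2])      # both now the region mean: 213
```

Note what this sketch leaves out: everything interesting. Deciding *which* regions to redact, on live first-person video, is exactly the machine-learning problem that requires collecting the data in the first place.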

Facebook’s critics likely won’t be mollified by any of this. The company’s track record of respecting its users’ privacy and handling data ethically is less than spotless, and it will take a huge lift to ensure that whatever AR future Project Aria shapes doesn’t become another forum for conspiracy theorists and hatemongers.

Project Aria is an important part of Facebook’s effort to build what it calls LiveMaps, which is what others in this space refer to as a “digital twin” or “AR Cloud.” Basically, it’s a machine-readable, 1-to-1 scale model of our world that is continuously updated and annotated in real time. This is the scaffolding upon which we can replace billboards with digital advertising or let virtual holograms appear as if they’re interacting seamlessly with whatever we see. A digital map promises to be a big boon for accessibility, providing digital eyes and ears to those with disabilities, and the capacity for AR to help us become more aware of our surroundings is enormous. Navigation, urban planning, and city infrastructure all benefit from the AR Cloud, and smart cities are eager to build a “virtual model of all the critical elements of their city.” It isn’t just companies interested; city officials and governments see the potential as well.

However, the data needed to power this vast digital twin inherently reveals the identity, location, and behavior of people in public or in the privacy of their own homes. The sheer volume of information needed to map and track people, places, and objects in real time in a way that creates a shared AR environment is massive, combining sensor data, location information, and raw video and audio streams. There’s a reason a digital duplicate of our physical world is a project that has been called a “total surveillance state.”

It matters who is building this world. Facebook is not the only company interested in mapping AR—Pokémon Go developer Niantic is building a “Real World Platform,” and one-time immersive tech darling Magic Leap lets spatial maps be shared across devices and apps via the cloud. But Facebook has a distinct advantage in this race because of its ubiquity and resources. While Project Aria is a separate effort, Facebook’s plans for AR will be aided by the company’s vast stores of location data and some of the most advanced machine-learning capabilities on the planet.

Maps hold tremendous power. They not only help people navigate the world; they also establish boundaries and shape our perceptions. The technology behind them matters just as much: Global navigation systems are military assets, and Apple publicly apologized for the shaky launch of its mapping app in 2012. We have gotten used to mapping roads, but AR changes the game by encouraging us to map every square foot of space on the planet.

Privacy groups like the Electronic Frontier Foundation have warned that if privacy “dies in [virtual reality], it dies in real life,” but the proposed solution is often for companies to collect less data or get more consent from users. If AR tech becomes as ubiquitous as the smartphones in our pockets, that is unlikely to work. Facebook has already pledged that Project Aria won’t be used in areas like restrooms, prayer rooms, and locker rooms, and will be allowed in homes only where the consent of everyone in the household has been obtained. That might bring some comfort for a limited-run research project, but what happens when everyone can use this tech? Already, some immersive headsets have collective mapping capabilities in which information, once shared, can never be deleted. Once you’ve given up the layout of your bedroom, that map is available for anyone else to use.

We could technically handicap our digital twins. For example, devices could be limited to only localized mapping, where a user device locally creates anchors within a space like an apartment or office that immersive tech can keep track of over time. But a crowdsourced global spatial map is what evangelists envision will be necessary for mass adoption of smart glasses.
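The localized-mapping alternative can be sketched in a few lines. This is a conceptual illustration with invented names, not any vendor’s API: anchors live only on the device, keyed to a private space, and deleting a space actually deletes it, because no cloud copy exists.

```python
# Sketch of localized (on-device) spatial anchors: named poses within a
# private space, never uploaded to a shared AR Cloud. All names invented.

class LocalAnchorStore:
    def __init__(self):
        self._spaces = {}   # space name -> {anchor name -> (x, y, z)}

    def place(self, space, name, pose):
        """Pin a named anchor at a pose within a private space."""
        self._spaces.setdefault(space, {})[name] = pose

    def resolve(self, space, name):
        """Only apps on this device, in this space, can resolve anchors."""
        return self._spaces.get(space, {}).get(name)

    def forget_space(self, space):
        # Deleting a private space really deletes it -- there is no
        # cloud copy that outlives the user's choice.
        self._spaces.pop(space, None)

store = LocalAnchorStore()
store.place("bedroom", "lamp", (0.4, 1.2, 0.0))
print(store.resolve("bedroom", "lamp"))   # (0.4, 1.2, 0.0)
store.forget_space("bedroom")
print(store.resolve("bedroom", "lamp"))   # None
```

The trade-off is exactly the one in the paragraph above: this design keeps the bedroom’s layout under the user’s control, but it cannot by itself produce the crowdsourced global map that AR evangelists say mass adoption requires.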

So we need a new rulebook for how the digital world can be mapped and annotated. Nonprofit stakeholders have begun to recognize the inherent privacy of homes, as well as other sorts of private property interests, in AR. The Open AR Cloud has put forward a privacy manifesto that calls for limits on how AR data can be shared, and the XR Safety Initiative’s recent privacy framework calls for the ability to designate private areas and to move mapping data among AR Clouds. (In full disclosure, I contributed to this framework.)

These issues are already manifesting themselves. In 2018, a group of renegade artists modified the Jackson Pollock gallery at New York’s Museum of Modern Art via an AR app. The MoMAR app framed one Pollock painting in an interactive representation of a smartphone running Instagram, while another painting was overlaid with right-wing conspiracy theories peddled by QAnon. At the Isabella Stewart Gardner Museum in Boston, an independent effort used AR to reinsert 13 stolen paintings into the museum. These examples may sound playful or harmless, but they point to a future where AR is used to map and vandalize our most intimate spaces.

Facebook has never been great at setting limits for itself, but AR will need limits. We may not want parts of our world to be mapped and augmented. The laws, regulations, and norms that govern the real world are different from the digital, but AR is poised to bring everything about the online world—good and bad—to physical reality. We should not wait to have these conversations after Facebook and its friends have already mapped things for us.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.


Slate is published by The Slate Group, a Graham Holdings Company.
All contents © 2020 The Slate Group LLC. All rights reserved.





Facebook’s New Project Aria Could Be an Augmented Reality Turning Point


