Shared posts

22 Mar 20:17

7 Things You Know in UX That Will Help You Design Chatbots

by Jesús Martín

At BEEVA we have been working with chatbots for some time now. In addition to the design framework I previously published, I’m going to share seven things from my UX design background that helped me approach the chatbot universe.

Visibility of System Status

It’s the first heuristic from Jakob Nielsen and one of the first you will find in every usability checklist. Applied to chatbot interaction, a kickass welcome message is a must: it sets user expectations and makes clear which functionalities the chatbot will cover. It may seem obvious that chatbots only cover a small scope, but if you don’t make that scope clear, users will crash into everything your chatbot cannot do and come away with unmet expectations.

In addition to the welcome message, we need to remind users where they are and what they are doing. Users who are not paying attention to our chatbot can get lost many times in a single conversation.

Example of the welcome message used on Maya, an in-house chatbot at BEEVA

Personality

It’s a concept commonly underrated in interfaces, but it’s one of the fundamental foundations of chatbots. It was thoroughly studied in Designing for Emotion, and the best example ever designed is the personality crafted for Mailchimp (here).

Because the interaction happens in a chat, the user’s level of involvement is really high, and many users will treat the conversation as something real. Because of this, it’s extremely important to work on the chatbot’s voice and tone, the bot’s humor, and even its hair color.

My advice here is to create a user persona profile with your chatbot’s personality to guide all of your messages. If you have no clue about where to start, here is a template you can use.

Personality template

Interaction Elements

Words. That’s it… nope!

There are already some great articles about the interaction elements we have nowadays (this one with seals is my favourite).

We could also draw a timeline of the evolution and adoption of the different interaction elements, but the conclusion would be the same. The trend is to avoid purely conversational interaction, replacing it with all sorts of complex interactions and transforming the old chat into something like an embedded web page.

Here you have an interesting article explaining how Google paved the way some time ago: How Google’s AI paved the way for the next generation of bots.

We have seen real conversations bring headaches and malfunctions. This can be avoided by offering users buttons and links to click on. Artificial intelligence and NLP can do amazing magic, but they don’t meet user expectations yet. You can read more about it here.
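To make the button-over-free-text idea concrete, here is a minimal sketch (in Python; the recipient ID and payload names are placeholders, not from the article) of a Facebook Messenger Send API message that offers quick replies instead of relying on NLP:

```python
# Sketch of a Messenger Send API payload: quick replies give the user
# tappable buttons, so input arrives as a known payload string instead
# of free text that NLP would have to interpret.
import json

message = {
    "recipient": {"id": "<PSID>"},  # placeholder page-scoped user id
    "message": {
        "text": "What would you like to do?",
        "quick_replies": [
            {"content_type": "text", "title": "Check status", "payload": "CHECK_STATUS"},
            {"content_type": "text", "title": "Talk to a human", "payload": "HANDOFF"},
        ],
    },
}

print(json.dumps(message, indent=2))
```

POSTing a payload shaped like this to the Send API renders the buttons under the message; the bot then branches on `CHECK_STATUS` or `HANDOFF` rather than parsing arbitrary sentences.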

Every platform has different elements, and they change all the time, so you’d better update your knowledge before starting a project.

Some of the interaction elements on Facebook Messenger

Information Architecture

Even in a chat we need to decide which information is offered first and which information we hide. We can treat the chatbot flow like a screen flow with a little information at every step and different interaction elements in each part, where some actionable “buttons” are just words users can type.

Because of that, I treat the information architecture as in any other interface, defining which interaction elements I will show at every step.

Navigation Flow

We need to design the navigation map for our chatbot in which we’ll define what will happen in every single step.

We should think through every possible scenario at every step: which interaction element the user will use, and what happens if they misunderstand a step — “What if they want to go backwards?”, “What about skipping steps?”, etc.
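As a rough illustration (the step names are invented, not from Maya), such a navigation map can be modeled as a small state machine that handles “back” and “skip” explicitly at every step:

```python
# Toy chatbot navigation map: each step declares its next step, and
# global "back" / "skip" handling works from the step ordering.
FLOW = {
    "welcome":   {"next": "ask_name"},
    "ask_name":  {"next": "ask_email"},
    "ask_email": {"next": "confirm"},
    "confirm":   {"next": None},  # end of the conversation
}
ORDER = list(FLOW)

def advance(current, user_input):
    """Return the next step, honouring 'back' and 'skip'."""
    if user_input == "back":
        idx = ORDER.index(current)
        return ORDER[max(idx - 1, 0)]      # never go before the first step
    if user_input == "skip":
        nxt = FLOW[current]["next"]
        return FLOW[nxt]["next"] if nxt else None
    return FLOW[current]["next"]

print(advance("ask_email", "back"))  # ask_name
print(advance("ask_name", "skip"))   # confirm
```

Designing the map this way forces you to answer “what happens here?” for every step before writing a single message.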

You can find more about how I do it on my Design framework for chatbots.

Basic navigation flow for Maya, a BEEVA chatbot

Prototyping Tools for Non-Developers

Just as we can find interface design tools, chatbot tools are beginning to spread. I divide them into two groups: low fidelity and high fidelity.

In the first group are tools that help us explain our chatbots; in a short time we can get great results and be ready for testing much more quickly. There are a bunch of these, but my picks are:

  • https://botsociety.io: allows you to export the designed conversation as a video. They are adding more functionality to allow more complex conversations and even different flows within a conversation. They are probably the leaders in this area right now.
  • https://botframe.com: you can generate a fairly customizable conversation and export it to PNG.

With high fidelity tools, we end up building a real chatbot without writing a single line of code, though obviously with limited functional complexity. Among them I would recommend:

  • https://octaneai.com: the position they have reached in less than a year is something to admire. I like their take on convos and the poll functionality.
  • https://chatfuel.com: probably the rival to beat, but they haven’t changed much in the last few months while their competitors have. They offer chatbots for Facebook Messenger and Telegram.
  • https://rundexter.com: the first tool I discovered, back when they weren’t 100% focused on chatbots; they remain somewhat apart and seem oriented towards developers.

Analytics Tools

There are already a few players vying to become the Google Analytics of chatbots. Here are some of them, plus a few other tricks that will help you understand what’s going on with your chatbot:

  • http://botanalytics.co: they offer a lot, but the results they deliver are still quite far from what you’d expect.
  • https://www.dashbot.io: the best one we have tested, but I still find some information missing… Besides the figures they offer, I’d love to see data on my chatbot’s performance, especially funnels and where users drop out of them.
  • In-app solutions. Some chatbot platforms, such as Chatfuel and Octane AI, already offer some data themselves.
  • Direct access to Facebook Messenger conversations. Every chatbot on Messenger is associated with a Facebook page, which means you can review the automated conversation at any time and even take part in it. Basically, you can read every single thing a user says to your chatbot. Remember that the next time you swear at an assistant.

Conclusion

Chatbots are one of the buzzwords of the moment, and they may eventually replace most of the interfaces we are used to. It will take time, and maybe the definitive chatbots and virtual assistants won’t look like the ones we have today. Don’t worry, though: as a UX designer you are already a master of most of the things you need for their design.


7 Things You Know in UX That Will Help You Design Chatbots was originally published in Chatbots Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.

21 Mar 08:41

The Internet Is Completely Losing It Over This Painting Shiba

by Tanja M. Laden

There are hunting dogs, rescue dogs, police dogs, sled dogs, show dogs, dogs that do tricks, and dogs that do nothing at all. Now there is a painting dog: his name is Hunter and he comes from Edmonton, Alberta, Canada. While your dog is only good for licking your face and costing you a fortune, Hunter specializes in expressionism and earns money for his owners by selling his canvases on Etsy.

His owners, Kenny and Denise, are a software engineer and a teacher, respectively. Together, they patiently taught Hunter to use a brush, a palette and a canvas so he could express the doubts, joys and malaise of an uprooted Japanese dog.

Gold Coast

“Like all dogs, he loves doing tricks, especially if they earn him a treat. But with him, it goes further. When he learns something new, you can really see he is proud of himself. It’s not just the food that motivates him. You can tell he enjoys learning and discovering new things, even if he doesn’t succeed at first. I think that’s something most artists and most dogs have in common.”

Midnight Reverie

Needless to say, Hunter’s career is a hit… mostly online for now, as no gallery has come forward yet.

“We don’t think he is especially gifted, no more so than other dogs or species. We’ve seen dogs do things Hunter really couldn’t pull off. As with humans, it really depends on personality and taste. And you have to be patient and truly invest in your dog’s passion. You have to constantly encourage them to push their limits.”

Selfie

18 Mar 15:08

Stop Everything: Someone Just Invented Tape for LEGO

by Julien

On Indiegogo, the company Nimuno is selling the first adhesive tape in history compatible with LEGO bricks and plenty of other construction-toy pieces. The idea is brilliant! Bricks snap onto this ingenious tape, which can be cut, bent, folded, stuck, unstuck and re-stuck anywhere. Walls, corners, furniture, windows… any surface becomes a real playground.

a real playground

Behind the project are Anine Kirsten and Max Basler, two industrial designers from Cape Town, South Africa. Just three days after launching its crowdfunding campaign on Indiegogo, Nimuno had already met its goal, raising nearly $12,000 of the $8,000 requested. With a month left before the campaign ends, the company has already announced it is developing new colors. Expect to pay $8 per roll of tape, with a release planned for July 2017 😉

By Margaux Couturier.

18 Mar 13:36

HTC Announces Vive Ecosystem Conference

by Jamie Feltham

HTC might not have a big Western developer conference like Oculus does with Connect, but it’s planning a big show for China in just over a week’s time.

The Vive Ecosystem Conference 2017 is set to take place March 27th–28th in Shenzhen, China. HTC has told UploadVR that it’s planning some new announcements for the show, though we don’t have any specifics just yet. A keynote is set for 9:00 AM GMT+8 on the first day, where, according to a rough translation of an invitation, the company will “announce new strategic cooperation and the latest initiatives to build the global VR ecosystem.”

Exactly what that means is open to speculation, but HTC is no stranger to strategic alliances in China and the rest of the world; it set up the VR Venture Capital Alliance (VRVCA) to bring investors together to fund VR startups, and it is working closely with others to establish VR within China. On the ecosystem side, it’s quite possible we’ll get an update on the new Vive Tracker add-on, which is rolling out to selected developers now and will go on sale soon. Don’t expect big new Vive games and experiences to be revealed, then, but there could be some interesting announcements.

The second day of the event looks to be a more educational experience with seminars about the best practices for VR. There will be an experience area too where we’d expect attendees will get hands-on with the latest Vive demos.

We’ve reached out to HTC for more info about the event. We’re not sure if there will be a livestream; if there is, it’s likely to be entirely in Chinese. Still, those who like to keep a close eye on the Vive ecosystem will want to watch for any news coming out of the event.

Tagged with: htc vive, Vive Ecosystem Conference

18 Mar 13:33

Worldwide VR / AR headset market to hit 100 million devices

by Deniz Ergürel

Your weekly guide to Virtuality — Newsletter #57

Photo Credit — MetaVision

Although augmented reality (AR) remains the minority share of the market in terms of shipments, International Data Corporation (IDC) expects AR to bring in significantly more revenue over the forecast period. AR headset sales will grow from $209 million in 2016 to $48.7 billion in 2021, while VR headsets grow from $2.1 billion in 2016 to $18.6 billion in 2021. Together, the two markets are expected to grow at a compound annual rate of 58%, reaching 99.4 million units in 2021… (Read more on the IDC Blog)

This is Haptical weekly newsletter #57. We bring you an in-depth look at the latest Virtual, Augmented and Mixed Reality technologies, services & products.

A very special thanks to John Buckley, Joseph Elichaa, Lisa Winter and others for starting a membership with Haptical.

If you want to get a deeper understanding of the state of virtual reality, you can sign up to our exclusive reports today. Go to bit.ly/joinhaptical and become a member now!

LATEST NEWS

  • Troubled GoPro deepens layoffs, eliminating positions in media and virtual reality divisions. (Forbes)
  • Skydance, Sony Pictures debut their first virtual reality titles. (Hollywood Reporter)
  • NVIDIA pushes virtual reality play with new benchmarking tool, FCAT VR (ONMSFT)
  • The imaging technology startup Avegant says its light field prototype provides a truly immersive experience that is critical for the future success of the entire mixed reality industry. (Seeker)
  • Berkeley University opens a new lab for virtual and augmented reality. The lab will be open for faculty members, graduate and undergraduate students. (Haptical)

NEW EXPERIENCES

  • Volkswagen takes virtual route to develop the car of the future. (Auto Car Pro)
  • Students are developing a virtual reality game based on James Joyce’s “Ulysses” as part of a class at Boston College. (New York Post)
  • Airlines Technology has worked up a proof of concept which could allow airlines to sell assigned seats via a virtual reality (VR) seat map. (Tnooz)
  • Accenture creates Mixed Reality experience in partnership with RBS 6 Nations. (Sport Techie)
  • Civil Air Patrol uses virtual reality to spark interest in aviation. (Chicago Tribune)

INVESTMENT

  • Augmented reality company WayRay has raised $18 million in a Series B round led by China’s Alibaba Group.

WHAT PEOPLE SAY ABOUT VIRTUALITY

  • This year, for the first time in its 31-year history, SXSW organizers created a “Virtual Cinema,” where VR/AR content creators could show off their best stuff. USA Today’s Rick Jervis headed up to the fourth floor of the JW Marriott in Austin to check out the cinema and see where the technology stands. (USA Today)
  • StoryUP’s Chief Storyteller Sarah Hill says VR non-fiction storytelling could influence anti-anxiety states. (Haptical)
  • Matthew Busel says 2018 will be the year most of the country becomes truly familiar with AR. (Haptical)

DON’T MISS!

  • Listen to these podcasts to dive into virtual reality. We curated some great episodes with our friends at RadioPublic. (Haptical)
  • A new concept design visualizes fully glass iPhone that integrates Siri with augmented reality features. (Haptical)

Follow us on Medium to get the latest Virtual Reality trends and news in your inbox every week.

Sign up to our exclusive reports today. Discover how virtual reality changes everything you knew about education, entertainment, health care and marketing.


Worldwide VR / AR headset market to hit 100 million devices was originally published in Haptical on Medium, where people are continuing the conversation by highlighting and responding to this story.

15 Mar 22:19

Weta Workshop’s Ghost in the Shell Major Skeleton is a Masterpiece [Video]

by Geeks are Sexy

Adam Savage gets up close with the one-of-a-kind 3D-printed endoskeleton Weta Workshop made for the upcoming Ghost in the Shell. Chatting with Weta Workshop technician Jared Haley in the studio’s 3D modeling room, Adam learns about the experimentation and prototyping necessary to make this gobsmackingly beautiful prop, which is made of several hundred individual pieces!

[Tested]

The post Weta Workshop’s Ghost in the Shell Major Skeleton is a Masterpiece [Video] appeared first on Geeks are Sexy Technology News.

15 Mar 07:58

Google Home Lands in Europe This Spring

by Marie Ciolfi
While we already knew that Google Assistant would arrive in France, in the language of Molière, next September, we now know that Google Home will begin its European rollout this spring. The United Kingdom is the first country to welcome the Mountain View firm’s home assistant. Could this arrival be a good omen for a launch in France...
15 Mar 07:50

Samsung Imagines a Bus Shelter That Surprises Passers-by and Chats With Them

by Creapills

In Israel, Samsung hijacks a bus shelter to promote its smartwatch, which starts chatting with passers-by.

Advertising billboards are everywhere in the street. We are so used to seeing them that sometimes we no longer pay them any attention. That is the insight Samsung played on for its new advertising campaign, staged in Israel with the agency Leo Burnett.

In Tel Aviv, the brand had the idea of rigging a bus shelter by hiding a real accomplice inside. While passers-by wait for the bus, the shelter starts interacting and chatting with them. A surprising and amusing stunt imagined by Samsung to promote its Gear S3 Frontier smartwatch. Well played!


Source Samsung & Leo Burnett Israel
Via adsoftheworld.com

This article, “Samsung imagine un abribus qui surprend les passants et discute avec eux,” originally appeared on the Creapills blog, the go-to outlet for creative ideas and marketing innovation.

15 Mar 07:34

Scene Fusion Brings Multi-User Development To Unity’s Editor VR

by Jamie Feltham

Back at GDC 2017 at the start of this month, Unity gave us a brief update on Editor VR, its new platform for developing 3D content within headsets like the Oculus Rift and HTC Vive. The company showcased some of the new tools supporting the platform, including Scene Fusion from KinematicSoup Technologies.

This week, KinematicSoup is announcing official integration with Editor VR as part of the platform’s latest update. That means any of the 6,000-plus people who have already downloaded the software in its alpha state can access this new tool, which lets multiple people collaborate in real time in multiple ways. Think of it as multiplayer VR while you’re actually making your game; using position-tracked controls, users can intuitively build a scene together.

This has a ton of potential for optimising VR development, and you can see why in the trailer above, released in January. Not only can two people enter VR together to edit scenes with each other, but one person could go into VR while the other works on the game from the traditional screen. The studio describes this as the out of VR developer doing ‘Big Picture work’ while the in-VR developer does ‘detail work’ and provides feedback.

KinematicSoup says the app could make VR development much more efficient and speedy. What really captures our attention, though, is the potential Scene Fusion could have when combined with other Editor VR tools like Tvori, allowing people to collaborate not just on object placement but on other creative aspects.

Scene Fusion is available as a premium non-VR plug-in for Unity too, but Editor VR hasn’t really been established as a monetizable platform yet, so it’s unclear when or how Tools developers will make money.

In fact, there’s still a lot to learn about Editor VR as a whole, but tools such as these suggest Unity is well on its way to building the accessible development platform it envisions for the future, whether people are making VR or non-VR content.

Tagged with: Editor VR, KinematicSoup, Scene Fusion, unity

15 Mar 07:32

Hayo’s Kinect-like sensor for your home now has an API

by Romain Dillet
Sometimes I miss the Kinect. It was such a breakthrough device, yet it didn’t go anywhere. But it inspired a lot of entrepreneurs, including the team behind Hayo. This new device lets you set up virtual buttons on a table to control all your connected objects. Hayo recently announced an API so developers can play with the company’s image recognition technology. Read More
12 Mar 19:22

What Happens If You Throw a Magnet In Copper Pipe [Video]

by Geeks are Sexy

Most of you probably have already seen videos of similar experiments, but this is so cool that I just had to share it with you guys.

[Oleksandr Hech]

The post What Happens If You Throw a Magnet In Copper Pipe [Video] appeared first on Geeks are Sexy Technology News.

12 Mar 19:16

Wizards of the Coast is releasing an official app to support your D&D campaign

by Andrew Liptak

Stacks of papers are a staple at any regular Dungeons & Dragons game, but Wizards of the Coast is finally offering a digital client to help gamers manage their characters. Today at PAX East, the company revealed a new “Digital Toolset” that works with Fifth Edition called D&D Beyond.

Billed as a game management tool, D&D Beyond has an official rules compendium, a character builder and character sheet. From the looks of it, it’s designed to make it easier for players to look up rules and update their characters as they play.

The company partnered with Curse, a media company owned by Twitch that builds gaming management tools for a range of platforms. D&D Beyond appears to be a client designed to bring the traditional pen-and-paper...

Continue reading…

10 Mar 07:11

7 Things We Can’t Wait To Stick The Vive Tracker On

by Jamie Feltham

I sort of see the new Vive Trackers as the Post-it notes of VR, in that I want to stick them to absolutely everything. I often walk down the street looking at things you could virtualize by clamping HTC’s new peripheral to them. In fact, I probably scare people when they see me contemplating a trash can and talking to myself about what it could do for VR.

Okay, that doesn’t really happen, but the Vive Tracker does open up a new world of possibilities for SteamVR. The kit, which is essentially a piece of plastic whose position your base stations track just like they do your headset and controllers, is already shipping to developers, and we’ve seen a handful of ideas from the community.

But we’ve had a few ideas of our own. Here’s what we’d like to stick the Vive Tracker to when it starts to roll out later this year.

Instruments

When you think about it, Oculus and Harmonix thought up the basic concept of the Vive Tracker a long time ago, when they stuck a Touch to a plastic guitar and gave us Rock Band VR. We’ll find out how that worked out for the pair in a few weeks, but the Tracker could actually bring our real instruments into VR so that we could play them in front of scores of adoring virtual fans. There’s a lot of potential for a Rocksmith-style VR tutoring game here.

Cups

We all know playing VR games can be physically demanding — my shoulders still hurt from my latest round of Paulo’s Wing — so it’s a pretty good idea to stay hydrated throughout. Removing your headset and finding your icy beverage is annoying; why not just have it in the virtual world too? Note that this won’t solve VR’s awkward angles issue where most cups won’t actually be able to tip into your mouth without hitting the bottom of your headset. Forget positional tracking; tell John Carmack this is where the real innovation is needed.

Phones And Tablets

I’ve already seen a pretty cool multiplayer VR game that stuck a Vive Tracker to a phone, but I’m talking about bringing those devices into the virtual world with the user. Smart devices could be incredibly versatile input mechanisms for specific VR experiences. Imagine what one could do for existing experiences like Tilt Brush, allowing you to fine-tune your creations with greater depth. Plus it would be great to quickly flick through your phone in VR.

Toy Lightsabers

Since 1977 humans have dreamed of what it would feel like to hold a real lightsaber and do glorious battle with evil forces. We came pretty darn close in 2016 with Star Wars: Trials on Tatooine but we could come even closer with the Vive Tracker stuck to a plastic sword. Imagine two of these things in the hands of two friends having what felt like a real lightsaber battle with tactile feedback. Okay, better make them out of foam instead. No headshots, either.

Friends

We’ve already used the Vive Trackers to bring our own bodies into VR experiences, but what if we put them on our friends? What types of experiences might be possible? Imagine the kinds of co-op games you could make in which friends could work with you and have a real presence in the virtual world. Again, you’re probably dealing with a lot of health and safety concerns, but this is definitely something that games like Keep Talking And Nobody Explodes should consider for future updates.

Pets

Some readers with a friendly cat or dog will probably be familiar with the stomach-churning guilt that comes with accidentally kicking them while walking around in VR. It’s all too easy to forget that a four-legged creature might be trotting up to say hello while you’re jumping around in your living room. I would happily pay $100 not to have that experience again and attach a VR life jacket to my pets. This might also be good for babies.

Oculus Rifts

The Tracker might not be the perfect answer but it’s certainly one possible way of achieving a larger room-scale setup for your Rift. As Valve’s tracking hardware continues to evolve it would be great to give Rift owners the option to buy some base stations and a few trackers to get larger room-scale tracking on their headsets.

Tagged with: htc, htc vive, Vive Trackers

10 Mar 07:10

Blockchain Will Change Far More Than How Your Money Is Managed

by Benjamin Stecher

Organizational Upgrade

The times they are a-changin’. Spurred by technological innovation, our world is hurtling forward at unprecedented speed, altering how we work, play, and live. However, our organizations, institutions, and governments have not kept up with the pace of change. Too many of them were built on outdated systems that could never have anticipated the impact of things like the computer or the internet. Thankfully, our methods for organizing ourselves are finally evolving, and no breakthrough in that regard has been bigger than blockchain technology, which has the power to fundamentally change how we organize ourselves.


What is revolutionary about blockchain is that it enables a distributed check on power. Today our systems still rely on centralized authority where trust is given to a select few. What blockchain enables us to do is replace these systems with ones that have no need for a central authority.

The key to the blockchain is that everything on it is monitored by everything else on it. Using Bitcoin as an example: whenever a bitcoin is exchanged, that transaction is checked against every other transaction in the system to make sure it is authentic, and it is then tied to the history of that bitcoin. The entire history of each bitcoin is stored in an open ledger, making every transaction traceable.
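The checking-and-chaining idea can be illustrated with a toy hash-chained ledger in Python. This is a sketch of the ledger part only; real blockchains add distributed consensus, proof of work, and much more on top:

```python
# Toy hash-chained ledger: each entry commits to the hash of the
# previous one, so altering any past transaction invalidates every
# later link and is immediately detectable.
import hashlib, json

def block_hash(tx, prev_hash):
    payload = json.dumps({"tx": tx, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def add_block(chain, tx):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"tx": tx, "prev": prev_hash, "hash": block_hash(tx, prev_hash)})

def verify(chain):
    """Re-derive every hash and check each link points at its predecessor."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != prev_hash or block["hash"] != block_hash(block["tx"], prev_hash):
            return False
    return True

chain = []
add_block(chain, {"from": "alice", "to": "bob", "amount": 1})
add_block(chain, {"from": "bob", "to": "carol", "amount": 1})
print(verify(chain))           # True
chain[0]["tx"]["amount"] = 99  # tamper with history
print(verify(chain))           # False
```

Because every block’s hash depends on the whole history before it, rewriting one old transaction forces an attacker to rewrite every later block as well, and on a real network, to do so on a majority of the distributed copies.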


But the real beauty of a blockchain is that it lets people place absolute trust in whatever it is powering. Trust is the foundation of all institutions: they must earn the trust of their participants and then continue to prove themselves trustworthy for people to keep using them. This is why we exchange goods and services for money: we have faith that the currency we traded for will hold its value. Credit, without which no economy can function, is itself defined as “the ability of a customer to obtain goods or services before payment, based on the trust that payment will be made in the future.”

What the blockchain lets us do is avoid those leaps of faith altogether. Systems powered by blockchains are inherently trustworthy because there is no wiggle room for corruption, fraud or even human error: everything is distributed, verifiable and open to anybody using the system. For more on how it works, click here.

It is hard to overstate the kind of impact this can have. As stated in the book Blockchain Revolution, “some scholars have argued that the invention of double-entry book-keeping enabled the rise of capitalism and the nation-state. This new digital ledger of economic transactions can be programmed to record virtually everything of value and importance to humankind…In fact, soon billions of smart things in the physical world will be sensing, responding, communicating, buying their own electricity and sharing important data, doing everything from protecting our environment to managing our health. This Internet of Everything needs a Ledger of Everything. Business, commerce, and the economy need a Digital Reckoning.”


More Than Money

Not only are a host of financial services now using blockchain technologies, but the technology is also being used to write smart contracts, secure our digital identities, and even back marriages. IBM is a leading player in blockchain and recently launched a blockchain-powered tool to help government organizations gain more trust from their citizens. The technology is spreading rapidly as more and more companies see its merits.


But its real potential lies in the role it could play in reshaping our political systems. The technology allows for direct democracy, where every citizen could vote on issues themselves rather than just electing representatives to decide for them. In addition, if national currencies adopted blockchain-backed systems, each citizen would be able to track exactly how each of their tax dollars is spent. It holds within it the promise of a world that is incorruptible and completely transparent, in which anybody who wants to participate can have an equal say in how society orders itself.

The technology that enables the blockchain has only been around for nine years, and like all good ideas, it will take time before its potential is realized. But it does seem to be one of those ideas with staying power, as it carries the ability to change how we organize the world we live in.

The post Blockchain Will Change Far More Than How Your Money Is Managed appeared first on Futurism.

08 Mar 19:02

How to Design the Best UI for Room-Scale VR

by Roxana Brongo

As virtual reality becomes more and more mainstream, we, the designers and developers, are starting to build in bullet-proof techniques and…

Continue reading on Virtual Reality Pop »

08 Mar 08:24

Microsoft just released a fully functional, Terminator-style HUD for HoloLens

by Mark Coppock

Microsoft posted a tutorial on creating a Terminator T-800 heads-up display using HoloLens and Unity, complete with face detection and optical character recognition capabilities.

The post Microsoft just released a fully functional, Terminator-style HUD for HoloLens appeared first on Digital Trends.

08 Mar 08:06

DeepCoder, the AI That Can Write Simple Programs in a Developer’s Place

by Rude Baguette

Researchers from Microsoft and the University of Cambridge have created an AI that can write programs without human intervention by combining lines of code taken from existing software: a step toward automating developers’ work?

Microsoft and the University of Cambridge have developed an artificial intelligence named DeepCoder that uses machine learning techniques to write simple programs by combining lines of code found in existing software. The AI checks that the code combinations it proposes actually work, and as it builds up working combinations, it can produce increasingly complex programs.

Currently, DeepCoder can produce working programs of up to 5 lines of code. That may not sound like much, but 5 lines are already enough to build relatively complex functionality.
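DeepCoder's core move, searching for a composition of known operations that reproduces input/output examples, can be illustrated with a brute-force enumerator (the mini-DSL and examples below are invented for illustration; the real system uses a much richer DSL and a neural network to guide the search):

```python
from itertools import product

# A tiny DSL of list-to-list operations (illustrative, not DeepCoder's real DSL).
OPS = {
    "reverse":  lambda xs: xs[::-1],
    "sort":     lambda xs: sorted(xs),
    "double":   lambda xs: [2 * x for x in xs],
    "drop_neg": lambda xs: [x for x in xs if x >= 0],
}

def synthesize(examples, max_len=3):
    """Enumerate operation sequences up to max_len and return the first one
    consistent with every (input, output) example."""
    for length in range(1, max_len + 1):
        for seq in product(OPS, repeat=length):
            def run(xs, seq=seq):
                for name in seq:
                    xs = OPS[name](xs)
                return xs
            if all(run(inp) == out for inp, out in examples):
                return list(seq)
    return None

# Two examples that pin down a 2-step program combining doubling and
# dropping negatives.
examples = [([1, -2, 3], [2, 6]), ([-1, 4], [8])]
program = synthesize(examples)
```

Even this toy search finds a correct two-step program; the hard part, which DeepCoder's learned model addresses, is pruning the combinatorial explosion as programs get longer.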

Creating original code combinations

An AI of this kind has the advantage over a human developer of working faster and synthesizing a larger mass of data, and therefore of finding code combinations a developer might not have thought of. As it improves, it will automate the most tedious parts of developers' work.

The official goal is obviously not to replace human coders, but to free them from the tasks AIs can handle in their place, leaving them more time for the most complex parts of their work.

A risk of developers being (partially) replaced?

That said, Marc Brockschmidt, one of DeepCoder's creators, says future versions of the AI will be able to build software that scrapes data from websites or categorizes photos on social networks; in those cases, no human intervention would be needed.

"DeepCoder, l’IA capable de coder des programmes simples à la place d’un développeur" by Rude Baguette originally appeared on Rude Baguette

07 Mar 23:41

Room Scale MMO OrbusVR Has Been Funded On Kickstarter

by Charles Singletary
Room Scale MMO OrbusVR Has Been Funded On Kickstarter

As the software ecosystem grows, developers are starting to find a balance that allows them to create solid multiplayer experiences on VR platforms. The community is welcoming massive numbers of VR players, but the MMO (Massively Multiplayer Online) genre hasn't really found a solid foothold just yet. Enter OrbusVR, a Kickstarter project looking to bring room-scale combat, exploration, and crafting to VR in MMO form.

You can see the trailer for it below, featuring a simple visual style and action-packed gameplay.

At the time this article was written, OrbusVR had raised $23,997 against a goal of $10,000. Though this may seem meager by the standards set by other well-known games funded on the crowd-funding platform, OrbusVR is no less ambitious in its plans. The gameplay teaser shows huge groups of players fighting in fields with medieval fantasy weapons, exploring caves, teaming up against large boss characters, and even some fishing, the minigame that absolutely no MMO should ever go without.

“The support of this game by the growing community of VR enthusiasts continues to surpass our expectations at every turn,” says lead programmer Riley Dutton in a press release for the project. Dutton is also the Founder of Ad Alternum, the dev team making the game.

The game's low-poly style could be endearing to fans or a turn-off, but considering it was funded and then some, there are clearly fans interested in the project. Nor is the game merely conceptual: the footage in the trailer comes from three play tests in which 1,500 players took part, and things look quite functional from what we can see.

“Our Stretch Goals represent things that we want to do in the game eventually, but didn’t have the resources to commit to doing before launch,” said Robert Dutton, Level Designer. “We’ve intentionally kept the scope of the project reasonable so we can meet our deadlines, but with the extra help from the community, we can do more than we originally planned, which is great.”

OrbusVR is slated for release on HTC Vive and Oculus Rift. The Kickstarter runs until March 13th and, while the early-bird funding bundles are gone, there are still many contributor rewards available, such as closed beta and closed alpha access.

Tagged with: mmo, OrbusVR

03 Mar 22:49

Facebook Messenger Platform v1.4

by Jin (Hitoshi) Tanaka
Facebook messenger platform.

Today, Facebook announced Messenger Platform v1.4.
Since Facebook laid out a great vision and platform strategy in its ten-year roadmap, the Messenger Platform, with a backend combining a personalization engine, natural language processing, emotion recognition, machine learning, deep learning and more, could become the core system and main platform for developers and Facebook users alike.

“How will Bots change the world even more?”

We bot makers, bot entrepreneurs, marketing ninjas, and conversation flow designers are all trying to catch up, learning from each other and building great bots for users, collaborating with awesome folks.
Thanks, everyone. However, all of us need to listen up and look closely now.

As everyone digests today's update, the list below covers the things we need to know at a minimum; otherwise, you will be left behind!!
The Foxsy team has already tested the updates in order to improve our bot, Foxsy, and we can share some insights with you today.
I hope it's useful for everyone.

Here are the most important updates for all of us.

Messenger Update, v1.4

1. Persistent menu (Main menu)

Firstly, the persistent menu button has moved from the left side to the right, replacing Facebook's legendary “Like” button. If you tap it, the menu pops up from the bottom, where the keyboard would be. The pic above is an example. The previous keyboard also always had a persistent menu on its left side, but that has been changed and is gone.

Secondly, a single menu can now have up to three levels of items:

- Up to 3 levels of submenus
1. Top level: 3 items.
2. Second level: 5 items.
3. Third level: 5 items.

In total, that makes up to 75 items (3 × 5 × 5), like pull-down menus.
Hmm, not sure how to design this to guide users correctly and clearly.
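The 3 × 5 × 5 structure is easiest to see as the payload you would send to the Messenger Profile API. A sketch in Python (the field names follow Facebook's persistent-menu docs of the time, but treat the exact shape and all titles as illustrative):

```python
def nested(title, items):
    """A menu item that opens a submenu."""
    return {"title": title, "type": "nested", "call_to_actions": items}

def postback(title):
    """A leaf item that sends a postback payload to the bot."""
    return {"title": title, "type": "postback",
            "payload": title.upper().replace(" ", "_")}

menu = {
    "persistent_menu": [{
        "locale": "default",
        "composer_input_disabled": False,
        # 3 top-level items, each with 5 submenus of 5 actions: 3 * 5 * 5 = 75.
        "call_to_actions": [
            nested(f"Menu {i}", [
                nested(f"Menu {i}.{j}",
                       [postback(f"Item {i}.{j}.{k}") for k in range(5)])
                for j in range(5)
            ])
            for i in range(3)
        ],
    }],
}

def count_leaves(items):
    """Count actionable leaf items across all nesting levels."""
    return sum(count_leaves(it["call_to_actions"]) if it["type"] == "nested"
               else 1 for it in items)

leaf_count = count_leaves(menu["persistent_menu"][0]["call_to_actions"])
```

Seventy-five leaf actions is a lot of surface area, which is exactly the design worry raised above: the payload allows it, but guiding users through three levels is another matter.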

2. Update issues

Even though Facebook shipped a great update today, there are some rollout issues. Who gets the update, and when, depends on the following:

- Developers must implement the new functions on their side
- Developer accounts: March 2nd
- All public users: March 20th

If you are all set, note that as we update our bot or users update their Messenger app, we bot makers need to account for both situations. Otherwise, the new UI causes a really bad UX. It's a tough topic for us.

“ Be sure to use Messenger v106 or above to test your bot with the new design. As of March 2nd 2016, all bot developers will see this menu. To allow bot developers time to update, public users will not see the new menu design until March 20th. Your bot’s menus are cached by Messenger but updates are fetched periodically. If you update the menu, you can trigger this fetch by deleting the bot’s thread and then beginning a new one.”

If a user doesn't have the current version of Messenger, then even if you created the new persistent menu, they will still see the old menu, even when the number of items is just three. Well, how could we handle this better?

3. Pic size

This could bring better visual information. In our case, a profile pic could be clearer and much bigger than it is now. We might change ours in the future.

- horizontal image aspect ratio is 1.91:1
- square image aspect ratio is 1:1

What do you think about it?
However, this size change causes different kinds of problems: for example, quick replies are always present, and the content structure will change too.

There are quick replies, the keyboard, a new persistent menu, and the normal content.
“Wait!! Is this easy, or too much information? For sure.”
Well, we are developers, so we should speak up to the Facebook developer team: if you don't like something, please leave a comment. Together, our collective voice can reach Facebook and drive change.
Also, the Foxsy team has joined the FbStart program, so I think we can reach out to the Facebook team.

4. API access

Now we can access better data to analyze users' behavior more deeply.
For instance, these are the kinds of numbers you can access.

Facebook analytics is getting more powerful than before.
  • Daily unique active thread counts.
  • Daily unique conversation counts broken down by user feedback actions, including
- messaging turn-ons
- messaging turn-offs
- thread deletions
- report spams
- others

5. Bots Discovery

This will be a big change, and it's one we really need. As far as I have used and tested it, the discovery function has felt random, almost like spam.

- Localization support (it’s supported now)

In effect, it acts as a filter: users can find bots suited to their preferred country, and developers can configure their API to be discovered by country, so they can reach a specific type of user and segment.
It is still a broad mechanism, but I'm sure Facebook will bring improvements quickly, I hope. I believe in you, Facebook.

6. Share function

Foxsy gives each user a status and a tracking ID for referrals. It hasn't been perfect yet, though, because this part has issues with how the share link and share button reach the receiver.

- Able to delete the “attribution link”, the link button at the top of shared messages.
- Able to share from a webview.
- Able to send different content from the bubbles the share button is attached to.

7. NLP or template-based UI?

We might need to change Foxsy.
- Ability to turn off freeform input

Also, with the new update, two types of bots may show up. Once a user updates their Messenger, this screen appears,
and they are ready to use it to discover a new bot experience.

- Swelly: a button- and template-based UI.

(Swelly)
Swelly is arguably the most successful bot so far, with the largest user base. However, my friend Peter told me that the Swelly team deliberately doesn't focus on command- and text-based interaction and responses. That's how they built such an easy and user-friendly interface and experience.

- Hi Poncho, the weather bot.

(Poncho)
Poncho is well known as a friendly weather-prediction bot that feels like a friend.
In response to a user's command, Poncho replies very quickly and with great humor.

- M, run by Facebook.

(M, Facebook)
With M, everything is based on free text: a user can type whatever they want.
Based on rumor and a friend's experience (I haven't used it myself, which is a shame), the backend is a hybrid of an automated system and humans. That's how it creates the new experience of feeling like you're talking with a human.

“Make meaningful and beautiful connections in the real world”

Facebook Messenger Platform v1.4 was originally published in Chatbot’s Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

02 Mar 06:50

Microsoft starts shipping mixed reality developer kits

by Deniz Ergürel

Microsoft’s mixed reality support will also arrive on all versions of the Xbox One

Continue reading on Haptical »

02 Mar 06:49

Forget ‘user-friendly’ bots and focus on advanced usefulness

by Toby Barnes

Designing bots to be useful and engaging over simply being “user-friendly”

Continue reading on Chatbots Magazine »

01 Mar 21:19

GDC 2017: Epic Games Unreal Engine VR Editor Coming in April With New Features

by John Gaudiosi
GDC 2017: Epic Games Unreal Engine VR Editor Coming in April With New Features

Epic Games is using the Game Developers Conference (GDC) to give an advanced preview of the latest additions to its Unreal Engine VR Editor, which allows creatives to build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building. The goal is to officially launch the new VR Editor by April 17.

Mike Fricker, technical director at Epic Games, told UploadVR that working directly in VR provides the proper sense of scale necessary to create realistic, believable worlds, while the use of motion controllers means artists and other non-programmers can build environments with natural motions and interactions.

Epic’s own Robo Recall team used the VR Editor to build out the free pack-in game for the Oculus Rift with Touch, which also makes its complete debut at GDC this week.

“As soon as they started using it, they realized what the most beneficial use cases were to them,” Fricker said. “Inspecting and tweaking was one of them, but sometimes they just want to throw in things really quickly and see it at scale without having to constantly take the headset off and on.”

The Robo Recall team had a direct impact on the new VR Editor that everyone will have access to in April. Fricker said the team needed little power user features like the ability to snap objects right to the ground instantly without having to go grab them from a menu and move them down to the ground.

“They asked us to give them the power to use these additional features so that they can stay in VR longer,” Fricker said. “That’s not to say that we’re trying to replace desktop. If they’re going to go and do blueprint scripting or material editing, you can get to that stuff in VR and you can make some progress if you knew you were going to tweak something or make a quick change to something. If you’re going to develop a function library or a new game system, you’re probably not going to do that in VR today. But the fact that you can go and see it and inspect it without having to leave VR, that’s the feedback that we got from the team.”

Developing inside VR not only opens things to all members of a team, it also speeds up the development process.

“It’s much faster to navigate a scene in VR than it is with the desktop, where you’re constantly using the combinations of the mouse and keyboard and modifier keys to orbit around an object and zoom the camera around,” Fricker said. “In VR, it’s one-to-one. I know exactly where I’ll end up at any point. Once you get used to it, it’s super fast.”

Lauren Ridge, tools programmer at Epic Games, said they've put in safeguards to ensure developers don't get sick working within VR. For example, you can only rotate toward one direction at a time. Not a single Epic user has ever had motion sickness problems while in the VR Editor at the studio, where high-end PCs ensure a fast framerate.

“We have various levels of safeguard settings that will do things like turn on a grid for my tracking space or dissolve the sky into grayness,” Ridge said. “For example, in real life, I don’t have the ability to grab the world, turn it like a steering wheel and see the sky change. To some people, that’s instantly not good, so we’ve looked at all the different cases people have and added safeguards for them. You also can’t tip yourself over.”

Ultimately, the VR Editor has been designed to allow creatives to do whatever they want. Epic showcased a complicated scene set on a beautiful beach during its GDC Keynote, which includes a surfing mini-game as well as a sea plane flying overhead. Moving the plane to a higher altitude is done in seconds by grabbing the plane and moving its trajectory.

“We’ve been improving things since last year, which was the equivalent to our early access,” Fricker said. “We know that navigating 3D spaces is really fun and fast in VR, so that’s another cool thing that we’re excited about.”

The GDC beach demo also shows how easy it is to access the Unreal editor UI in VR to change settings or change what types of plants you’re painting down for foliage painting. The brush has been improved and makes things like undo and redo more accessible with a quick action.

Simulate mode allows developers to see how objects act when physics are attached. Ridge shows rocks of different sizes accurately falling off a cliff that overlooks the beach.

“This means you can use physics as an art tool,” Ridge said. “When you move the rock around gravity will act on it. You can also trigger gameplay events.”

The demo shows accurately built wooden 2x4s being snapped together into a staircase for a wooden hut on the beach.

“We also added more precise snapping tools,” Fricker said. “That’s about having things look organic and natural, but we also wanted a way to have really precise interactions with objects.”

Epic is taking advantage of VR, which offers more degrees of freedom with motion controllers than when using a traditional mouse and keyboard.

“If I paint using different pressure on the trigger of the motion controllers, it’ll paint different strengths of the rock material down,” Ridge said. “This is cool because the editor already had various painting and fluid creativity features, but then being able to use those with motion control suddenly made them way more accessible. I can instantly get the bird’s eye view and see how it looks all in the scene and then jump down to see the player’s view of it to make any changes.”

Epic has also partnered with Disney’s Pixar Animation Studio to have Unreal Engine 4 and the VR Editor support Pixar’s Universal Scene Description (USD) 3D graphics pipeline. Epic showed the coral reef from Finding Dory and characters Crush the sea turtle and Mr. Ray the manta ray running in UE4.

“The cool thing here is that we don’t need any other separate tools to go from USD to what you’d see on screen with this demo,” Fricker said. “USD is a pretty big deal to the film industry and other non-gaming uses, but it has some special powers that make it equally awesome for games too.”

Pixar wants to add more plug-ins for creatives beyond Autodesk Maya, so UE4 now opens up new opportunities for companies working in VR.

“As more plug-ins appear, more people will begin using this format,” Fricker said. “USD has a really elegant set-up for just describing a scene in its entirety with all the information you need to uniquely instance specific things along with dealing with complex animation.”

“We know the film industry will like it,” Ridge added. “We will increasingly use USD here. Hopefully, we will keep working with Pixar to make it awesome for every use case we can imagine. Right now we are working on USD import, but at some point we will probably be able to generate USD files as well.”

Tagged with: Epic Games, Unreal engine, vr editor

01 Mar 21:17

Would Google Glass-style augmented reality work better on a smartwatch?

by Chaim Gartenberg

Augmented reality is a tough trick to pull off in the real world, and one need only look at things like Google Glass to see how hard it is to make wearable technology that enhances life digitally without awkwardly intruding on the real world. The WatchThru is a concept created by a team of researchers from the University of Bremen, Google, and Hasselt University that offers a different spin on AR with an intriguing expansion to traditional smartwatches: a second transparent display for augmented reality interactions.

The research team has a couple of interesting use cases for the additional screen. For example, having the glass show a directional arrow to help navigate around a campus, while a more traditional map display is...

Continue reading…

28 Feb 23:33

This $100 Star-Lord helmet has built-in Bluetooth headphones

by Chaim Gartenberg

Guardians of the Galaxy Vol. 2 hits theaters later this year, and part of the oncoming wave of toys and tie-ins that Disney will be launching is this truly spectacular Star-Lord replica helmet. For $100, you’ll get a light-up version of the glowing helm from the film complete with sound effects. Oh, and it’s also a fully functional Bluetooth headset. So you can not only look like Star-Lord, but rock out in true Peter Quill style.

Do they sound good? Probably not. At the end of the day, it’s still a plastic toy helmet. But like the $100,000 diamond headphones, the Star-Lord helmet isn’t about specs. It’s about style. Throw in a fancy leather duster and some boots, and you’re basically Chris Pratt.

Most importantly, the...

Continue reading…

28 Feb 20:53

With 6,000 Downloads, Unity’s Editor VR Is Making VR Dev Accessible To All

by Jamie Feltham
With 6,000 Downloads, Unity’s Editor VR Is Making VR Dev Accessible To All

Unity launched its Editor VR service in an alpha form late last year, and the engine-maker appears to be pretty happy with the results so far.

Editor VR is an in-VR section of the larger Unity engine that lets you edit virtual scenes in games and more. Taking to the stage at the company’s GDC 2017 press conference this morning, Principal Designer on the Labs team, Timoni West, revealed that the platform has had over 6,000 downloads since its launch. That might not sound like a huge amount, but given the platform’s alpha phase it’s a good start. Some of those users have already contributed code back into the platform’s code base too.

West then showcased some of the Tools that are now included in the alpha version. These are creative apps and middleware that are imported into the Unity engine, to literally provide developers with more tools to make VR experiences. They include animation apps like Tvorii, and ProBuilder, which allows you to build, texture and edit meshes inside VR.

The company wants yet more Tools, though, and to encourage developers to contribute their work, Unity is hosting a competition with an undisclosed cash prize and a chance to showcase at May’s VR and AR-focused Vision Summit.

Unity isn’t just leaving the development of Editor VR up to the community, though. West’s biggest announcement was its own Tool, the Cross Reality Foundation Toolkit (XRFT). This was described as a “framework” for just about anyone interested in working in VR, AR and MR, allowing them to “get up to speed” without starting from scratch.

“We want to give you the building blocks for interaction and locomotion,” West said, “and everything else you need.”

Included in the XRFT will be elements like cross-platform controller input, customizable physics systems, AR/VR-specific shaders and cameras, object snapping and building systems, debugging and profiling tools, and support for all major VR and AR platforms. The toolkit will be released as an open-source beta “over the next few months.”

It sounds like this could be the next big step in Unity’s plan to make VR development accessible to everyone and anyone.

Tagged with: Editor VR, unity, XRFT

28 Feb 12:53

This projector turns any surface into a touchscreen

by Creapills

At MWC, Sony unveils the “Xperia Touch”, a revolutionary projector that makes any surface touch-sensitive.

Forget your iPad and other touch tablets: Sony has just unveiled an innovation that should get people talking. At MWC (Mobile World Congress) in Barcelona, Sony revealed the Xperia Touch, an innovative projector capable of turning any surface into a touchscreen.

A preview of the product was already shown at SXSW last year, but the “Xperia Touch” project now looks more mature than ever: the projector is expected to go on sale in spring 2017. The Xperia Touch creates a 23-inch touch surface and uses a camera and an infrared sensor to detect the movements that correspond to taps and swipes.

The Sony Xperia Touch aims to shake up industries such as gaming, education, streaming and even cooking. A fine device that runs Android and may usher in a new era of technology.

The Sony Xperia Touch: an innovative projector capable of making any surface touch-sensitive


Source Sony
Via dezeen.com

The article “Ce projecteur transforme n’importe quelle surface en écran tactile” originally appeared on the Creapills blog, the reference media for creative ideas and marketing innovation.

27 Feb 21:23

Your Chatbot’s Personality Is The Key

by Spectrm

This article originally appeared on Spectrm Insights

Continue reading on Chatbots Magazine »

27 Feb 18:15

I Kicked A Dinosaur in The Face Using Vive’s Full Body Tracking

by Jamie Feltham
I Kicked A Dinosaur in The Face Using Vive’s Full Body Tracking

The scale for awesome things you can do in VR keeps increasing. First, I could see a dinosaur as if it were really there as it walked towards me, perhaps sniffing right in front of me as he pondered a potential meal. Then I could punch dinosaurs with position-tracked controls, realistically defending myself should they choose to attack. Now? I can kick dinosaurs. In the face. And it’s awesome.

A month ago we saw developer Cloudgate Studio experiment with full-body tracking on the HTC Vive, using two controllers for hands and another two strapped to your feet. The result was a surprisingly accurate approximation of your full body inside VR. At the time, the developer noted that it would implement the Vive’s new Tracker peripheral, switching out the controllers on the feet, as soon as possible. Well, now it’s done that and it looks something like this.

A build of Cloudgate’s Island 359 was on display at this year’s Mobile World Congress using these two makeshift add-ons, which hooked under the laces in my shoes. A belt with another tracker attached went around my waist, something the developers had said in last month’s video they would add for a touch more accuracy.

As the game booted up I was instructed to stand with my feet in some green outlines, with a mirror in front of me. Once I was aligned I saw the floating hands and feet transform into a full avatar in seconds, with no other calibration needed. The first time I did this I ended up with strange, elongated gorilla arms (I don’t think I was holding my hands in the right place), but the second time it worked much better. I could look down and see my body, and my arms would follow my hands — for the most part — in a realistic fashion.

There were the expected occasional glitches, with arms getting out of sync, but Cloudgate has only had the Trackers for a few weeks and with a consumer roll out of Vive Trackers not happening until later this year it’s got plenty of time to iron it out even more.

Actually playing Island 359 with this tracking was a lot of fun. In last month’s video we saw the developer kick items over. Here, I was lashing out at dinosaurs with my feet and sending them flying. Smaller dinosaurs would run up to me, and I’d stomp down on them, killing them instantly (yes, there was a guilty pleasure to it). For larger dinosaurs I’d sweep my leg into them. The tracking might have been a little glitchy, but it didn’t fail me.

As I continued to play I got a little more confident with what the tracking could do. I was attacked from behind and instead of turning around to deal with the problem I simply kicked my leg backwards, then turned to find an enemy crumbled on the floor. It made the game’s action more versatile and liberating than it already was.

The key to this small breakthrough is giving players options. At $99.99, not every Vive owner is going to buy a tracker, let alone three or more of them. But Cloudgate’s work isn’t essential to the Island 359 experience, it’s additive. It’s already playable with just your two regular Vive controllers, this just makes it that bit more immersive.

But sharing this work and letting other developers implement it into their own games could create a scalable VR experience that doesn’t fragment the user base because it doesn’t leave anyone out.

We’re still some months from finding out just how big of an impact this use of Vive’s Trackers will be, but I’m willing to bet a lot of enthusiasts will be ordering at least three units when it finally starts to roll out.

Tagged with: full body tracking, island 359, Vive, vive tracker

23 Feb 22:10

WeChat to launch an Amazon Go competitor in China

by Creapills

In China, the social network WeChat appears set to compete with Amazon Go, unveiling its own concept of a connected, mobile store.

Amazon Go's concept clearly inspires our Chinese friends. WeChat, the most popular social network in the country, has just unveiled a partnership with the startup BingoBox to roll out automated, staff-free stores, in the style of Amazon Go.

This store of the future is not just automated: it is also mobile. The small shops are designed to be moved easily from one place to another. Everything then works via QR codes: to enter the store, to identify yourself, and finally to pay. A surprising choice, given that the QR code has more or less been abandoned in France. In China, however, it remains very popular, and it lets WeChat offer a serious, more mobile alternative to Amazon Go. Wait and see.

To compete with Amazon Go, WeChat unveils a connected, mobile store concept in China, where everything works with QR codes


Source WeChat & BingoBox
Via siecledigital.fr

The article “WeChat va lancer un concurrent d’Amazon Go en Chine” originally appeared on the Creapills blog, the reference media for creative ideas and marketing innovation.

22 Feb 23:32

What locomotion mechanics for virtual reality?

by Jacky Lucien VINCENT

Virtual reality (VR) technologies have enjoyed real success in developer communities for several years, and applications are gradually leaving the labs as the biggest high-tech brands ship consumer headsets. There is little doubt that this emerging industry represents one of the next great technological revolutions of the 21st century. But total immersion in a virtual world has not yet been achieved, and it remains the greatest challenge for the sector's players. Solutions for immersing vision and hearing have reached a surprising level of maturity, but no locomotion mechanism yet comes close to reality.

Researchers are therefore asking more questions than ever about locomotion mechanics in VR. We are limited by the space available in the real environment, by our bodies (physical fatigue, possible nausea, anxiety), and by the price of the devices. On Wednesday, February 15th, for the first Virtual Wednesday, we hosted Damien Doury (VR developer and game designer) and Stanislas Chesnais (CEO of 3dRudder) at NUMA to discuss the hardware and software solutions currently being developed to address this problem.

Software

Roomscale

This technology lets you move around the real environment and translates your movements into the virtual one. Although the immersion is strong, this solution requires a lot of space to move in, and not everyone has a large room at home, which adds a constraint for content developers.

Running in place (RIP)

Unlike roomscale, where the player moves around the room, RIP techniques keep you in a fixed position, running on the spot, with direction controlled by the headset's orientation. This solves the space problem, but the direction of travel does not account for both head and body position. Solutions such as PocketStrafe are emerging, which use a phone slipped into your pocket to take the body's orientation into account.
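Running-in-place detection of the PocketStrafe kind usually reduces to counting footfall spikes in the phone's accelerometer magnitude; a toy sketch (the threshold and readings are invented for illustration, and real systems filter and debounce the signal):

```python
def count_steps(samples, threshold=11.5):
    """Count upward threshold crossings in accelerometer magnitude (m/s^2).
    Each crossing is treated as one footfall."""
    steps = 0
    above = False
    for a in samples:
        if a > threshold and not above:
            # Rising edge: the magnitude just crossed the threshold.
            steps += 1
            above = True
        elif a <= threshold:
            above = False
    return steps

# Resting gravity (~9.8 m/s^2) with three impact spikes from running in place.
readings = [9.8, 9.9, 13.0, 9.7, 9.8, 12.4, 9.6, 9.8, 14.1, 9.8]
steps = count_steps(readings)
```

The step rate then drives forward speed in the game, while the headset (or, with PocketStrafe, the phone's orientation) supplies the direction.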

The D-pad (directional pad)

The D-pad is a joystick that controls in-game movement without any real body motion. It is simple and can also be driven from a computer keyboard. On the other hand, it can trigger an unpleasant feeling of nausea, known as motion sickness, because balance is disturbed by the mismatch between what the eyes see and what the inner ear senses.
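A minimal sketch of this kind of stick-driven locomotion (the names and the head-relative convention are assumptions for illustration, not a specific engine's API):

```python
import math

def dpad_move(stick_x, stick_y, yaw_rad, speed=2.0):
    """Rotate the thumbstick vector by the headset yaw so that pushing
    'up' on the stick always moves the player toward where they look."""
    vx = speed * (stick_x * math.cos(yaw_rad) + stick_y * math.sin(yaw_rad))
    vz = speed * (stick_y * math.cos(yaw_rad) - stick_x * math.sin(yaw_rad))
    return vx, vz
```

Because this velocity is applied with no matching physical acceleration, the inner ear reports standing still while the eyes report motion, which is exactly the mismatch described above.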

Teleportation

To counter this nausea, engineers came up with teleportation. Using a laser pointer, the player can jump quickly across the space. It is a criticized technique, however, partly because it breaks immersion and partly because it can let players cheat, for instance by slipping out of the way of attacks. Variants of teleportation improve on the basic idea, such as "sliding" teleportation, or letting the player "pull" the world around them (a solution particularly popular in strategy games).
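At its core, a laser-pointed teleport is a ray–plane intersection: find where the pointing ray hits the floor, then move the player there instantly. A minimal sketch, assuming a flat floor (real implementations typically arc the ray and validate the destination):

```python
def teleport_target(origin, direction, ground_y=0.0):
    """Intersect the pointing ray with a flat floor at height ground_y.

    origin and direction are (x, y, z) tuples; returns the landing point,
    or None when the ray does not head toward the floor.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:  # pointing level or upward: no valid destination
        return None
    t = (ground_y - oy) / dy  # ray parameter at which y reaches the floor
    return (ox + t * dx, ground_y, oz + t * dz)
```

The instant position change is what both avoids motion sickness and breaks immersion; the "sliding" variant instead interpolates the player toward this same target over a few frames.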

Hardware

Exoskeleton

The advantage of the exoskeleton is that it adds touch, an input on top of vision and hearing. It records the player's movements (outputs), which can then be mechanically blocked to simulate the presence of an object in the virtual space.

Omnidirectional treadmill

First appearing in 2014, these devices aim to allow a full VR experience in a confined space. Thanks to low-friction mats and special shoes, real movement is translated into the virtual environment.

Other technologies

These hardware solutions are so-called full-scale solutions. But other players have chosen to build lighter or more affordable products: traditional suits covered with sensors, video-based motion-recognition technologies, or joysticks adapted for the feet. The latter is the choice of 3dRudder, which can be used while seated and avoids motion sickness while preserving the sensation of moving through space.

Different solutions for different applications

VR headset technology is already well developed and a standard has gradually settled in (two lenses converging on a split screen). For locomotion mechanics, by contrast, many solutions still coexist. Standards are slow to emerge because these different solutions actually serve several distinct applications.

VR designers and application developers have all grappled with the question of locomotion, creating hardware and software solutions that transcend the human condition. But while these interaction modes increasingly address the problems of balance and fatigue, they remain neither intuitive nor natural. Above all, they create breaks in immersion, the essential quality of this medium. Technologies are evolving quickly, and will progressively bring total immersion within reach.

The post Quelles mécaniques de déplacement dans la réalité virtuelle ? appeared first on NUMA Paris.