#Situational #Awareness #Shopping

Consider how we currently find things we want to buy: through advertising, by seeing them in films, or by noticing them around other people and places.

Why not this instead: while having a coffee with a friend in their house you see a nice bowl and say 'buy bowl'. Your personal IoT ecosystem checks the area and finds three bowls. It asks 'white bowl?' and you say 'yes'. The bowl is ordered based on your personal preference, which could be Speed, Price, Colour or anything else; for this scenario it's Speed, so the ecosystem locates the nearest supplier and orders the bowl for immediate delivery. You carry on chatting, and the bowl is delivered to your home, waiting for you when you get back. Payment is automated. You unpack the bowl, look at it, say 'great condition', and the feedback is allocated.
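To make the flow concrete, here is a minimal sketch of how that purchase decision might be wired up. Everything in it (the `Thing` and `Offer` structures, the `situational_purchase` function, the supplier names) is my own illustrative assumption, not part of the protocol itself, which is not yet published.

```python
# Hypothetical sketch of the 'buy bowl' flow; class, field and function names
# are illustrative assumptions, not part of any published protocol.
from dataclasses import dataclass

@dataclass
class Offer:
    supplier: str
    price: float          # local currency
    delivery_mins: int    # estimated time to the buyer's home

@dataclass
class Thing:
    kind: str             # e.g. "bowl"
    colour: str           # e.g. "white"
    offers: list          # Offers from suppliers that stock this Thing

def situational_purchase(command, nearby, preference="Speed"):
    """Resolve a spoken command like 'buy bowl' against Things detected nearby."""
    wanted = command.lower().removeprefix("buy ").strip()
    candidates = [t for t in nearby if t.kind == wanted]
    if not candidates:
        return "nothing matching found nearby"
    chosen = candidates[0]  # in the scenario the ecosystem asks 'white bowl?' first
    key = (lambda o: o.delivery_mins) if preference == "Speed" else (lambda o: o.price)
    best = min(chosen.offers, key=key)
    # Payment, delivery and the later feedback ('great condition') are assumed
    # to be handled automatically by other parts of the ecosystem.
    return f"ordered {chosen.colour} {chosen.kind} from {best.supplier}"

# The friend's kitchen: one matching bowl with two offers; the preference is Speed.
bowl = Thing("bowl", "white", [Offer("LocalKitchenware", 12.0, 45),
                               Offer("BigBoxStore", 9.0, 180)])
print(situational_purchase("buy bowl", [bowl]))  # -> ordered white bowl from LocalKitchenware
```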

There are more scenarios in our Open Networking Ecosystem Protocol patent, which will be published soon.

Situational Awareness Shopping #UX

I'm just going to get this out there, because there is a great deal of lying going on to the effect that the IoT does not affect the UX profession or the e-commerce business.

IoT system design does not require UX wireframes, as there are no GUIs.

The IoT is a complex ecosystem that not only changes interactions but also removes many of the common processes that have been adopted by people to use technology.

Situational Awareness Shopping #UI

Graphical User Interfaces are not a consideration for the IoT, as the interactive methods used to select and buy no longer run through container websites, advertising (as a separate activity), payment gateways or any other existing copy of a shop.

Digital versions of shops are irrelevant in a society run through situational awareness.

#Situational #awareness drives open #IoT #Ecosystems not #visual #interfaces

The foundation for this thinking goes back to the notion of the 'social life of things'. If things themselves exist and have a number of trajectories and states, then those things also potentially have accessible and useful human touch-points in the IoT.

Many of the interactions we humans have become used to are in fact simple touch-points to hidden and complex interactions within dispersed and non-interlinked (at the core) technology systems. This simplification process of creating a directed visual presentation layer enables us to maintain a simplified mental model of our interactions. However, the IoT's additional integration of voice, touch and thought requires a full understanding of the primary cognitive model for each IoT device, an associated and integrated cognitive model, possible clashes or drop-outs, and load descriptions (for each constantly changing ecosystem) by Thing and Cognitive Group. Only then can an interface be defined.
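As a rough illustration of what those descriptions could look like, here is a minimal sketch of per-Thing cognitive models, modality clashes and load by Cognitive Group. The structure, field names and thresholds are assumptions of mine, not a defined standard.

```python
# Illustrative sketch only: one possible representation of per-Thing cognitive
# models, modality clashes and load by Cognitive Group. All names are assumptions.
from dataclasses import dataclass

@dataclass
class CognitiveModel:
    thing: str           # the IoT device this model describes
    modalities: set      # e.g. {"voice", "touch"}
    load: float          # cognitive load placed on the person (0..1)
    group: str           # the Cognitive Group the Thing belongs to

def find_clashes(models):
    """Flag Things in the same Cognitive Group competing for the same modality."""
    clashes = []
    for i, a in enumerate(models):
        for b in models[i + 1:]:
            overlap = a.modalities & b.modalities
            if a.group == b.group and overlap:
                clashes.append((a.thing, b.thing, overlap))
    return clashes

def group_load(models, group):
    """Total load for one (constantly changing) ecosystem, by Cognitive Group."""
    return sum(m.load for m in models if m.group == group)

kitchen = [CognitiveModel("kettle", {"voice"}, 0.2, "kitchen"),
           CognitiveModel("oven", {"voice", "touch"}, 0.4, "kitchen")]
print(find_clashes(kitchen))           # both listen for voice -> a possible clash
print(group_load(kitchen, "kitchen"))  # combined load before any interface is defined
```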

Situational networks with IoT devices, services and humans

Above is a visual description of a set of available Things with a person walking through them, projecting themselves: a simple human journey. Working in a local model like this gets the notion of Things and Cognitive Groups across. Each colour group represents a Thing attempting to get our attention; each Thing does something different, with its own set of interactions, activities, behaviours and outcomes. Things can talk to each other or ignore each other. The person traversing the real world and the IoT ecosystem walks through several fields of interaction, and each time they enter a new field it communicates to them: availability, interaction, messaging (branding, cries for attention, warnings and so on). At the first position, P1, three touch-points seek engagement; by P2 it is six touch-points; at P3, five touch-points are seeking engagement.
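The journey in the figure can be reduced to a toy model: each Thing projects a field of interaction, and the person's position determines how many touch-points are currently seeking engagement. The coordinates and ranges below are invented purely to reproduce the 3 / 6 / 5 counts; nothing here is taken from a real deployment.

```python
# Toy model of the figure: each Thing projects a field of interaction and the
# person's position determines which touch-points are seeking engagement.
# All coordinates and ranges are invented to reproduce the 3 / 6 / 5 counts.
from dataclasses import dataclass
import math

@dataclass
class Field:
    thing: str
    x: float
    y: float
    radius: float   # range within which the Thing seeks engagement

def touch_points(fields, px, py):
    """Return the Things whose field of interaction contains the person."""
    return [f.thing for f in fields if math.hypot(f.x - px, f.y - py) <= f.radius]

xs = [0, 1, 2, 4, 5, 6, 6.5, 7, 8, 9, 10, 11, 12, 13]
fields = [Field(f"thing{i}", x, 0.0, 2.0) for i, x in enumerate(xs)]

for label, px in [("P1", 1.0), ("P2", 6.0), ("P3", 11.0)]:
    engaging = touch_points(fields, px, 0.0)
    print(label, len(engaging))   # P1 3, P2 6, P3 5
```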

There is no requirement for visual interfaces; in fact audio, smell or touch (vibration or texture) are more likely, and more desirable, for creating the ambience for localised interaction and mental association.

Further, the current cognitive models associated with the digital existence of tangibles may need to be reconsidered in the context of the IoT, as it amalgamates previously separate constructs. It could simply be that the detailed component view we have constructed around daily interactions is no longer valid, and that we can simplify not only our interactive behaviour but also our descriptors by moving them to high-level, commonly understood constructs (directional and instructional, like an avatar) rather than the detailed process models we tend to live by.
