Achieving technological singularity via Artificial Intelligence and Machine Learning - The vision for Online Shop
Derived from an observation of our own developments within the Artificial Intelligence and Machine Learning fields.
This could be a controversial topic, but it is one I want to shine a light on. The thought came to me after witnessing substantial results from experiments and tests that have taken years to conduct and refine, where data is fed in to produce fully dynamic outputs that we aim to make guaranteed and precise.
For those unfamiliar, our aim at Online Shop Inc. is to continuously develop our theoretical model within the confines of practicality and commercialization by employing the fields of Artificial Intelligence (AI) and Machine Learning (ML), referred to herein as an 'engine' for simplicity. Our purpose is to strive for perfect optimization of the shop instances our users create, to improve the overall customer experience, and to give the users running such instances a chance to increase overall basket value and conversions without the need for human input. Imagine receiving a perfect 10 NPS (Net Promoter Score) rating without much effort at all. This is the future that I personally see as the best fit for such an application of our engine.
Whilst generating art or doing one's homework is a novel and useful concept, one that sparks endless possibilities in our minds of what may lie ahead, it will ultimately require commercial application for continuous development and evolution; a necessary 'evil' for an everlasting renaissance of technological growth. Standards can only be tested, analyzed and adopted through quantifiable checks and balances.
For those shaking their heads in disagreement, I would note that nothing which truly matters in modern society comes free. Salaries need to be paid, and servers need to run. Energy is expensive, after all. The excitement of receiving something for free quickly fades as capital gets harder to come by; with no return on investment, the possibilities for new tools, features and so forth shrink. Even with the deepest of pockets, some sort of return needs to happen to ensure viability and continued excitement. You are only one search away from an abundance of real-life examples of why this is of utmost importance; the Wardenclyffe Tower comes to mind as a fitting example of why commercial intent is necessary when trying to design, build and scale things that are truly revolutionary. Or, to bring you forward to more recent times, consider the tribulations Elon Musk is experiencing with his takeover of Twitter.
If there is no upfront or recurring cost to access such tools, services and features, you are offsetting the cost. For all of the negative flak that Meta receives on a daily basis, it is, after all, free to use. For it to be free, something has to carry the commercial intent - that intent comes from the data you agree to provide, which advertisers use in ad campaigns for the products and/or services shown to you. A small price to pay, some might say; others may disagree. I personally commend Mark and his team for what they have achieved, and for the standards set to make platforms such as Facebook, Instagram and WhatsApp free to use. Operating, maintaining and continuously developing such platforms is anything but cheap. With Meta, you at least have some basic understanding of what you are giving up in exchange for the ability to use their tools, services and platforms - could you say the same about the everyday services you use that preach privacy, and yet are free?
So perhaps now you can understand where my thoughts on this subject lie, and the reasoning behind them.
We concluded that to achieve the full potential of our perceived perfect implementation of such a model into the engine we had developed, and would continue to develop, a reset button was necessary. So we decided to pivot, a highly controversial move. We were a well-performing marketplace focused on the United Kingdom, and it was not an easy decision. It forced us to become more resourceful than ever. We stand by our good intentions, and as such decided to pay back all investors and purchase our intellectual property outright from our own company, unwinding our British operation and reforming in the United States as a new corporate entity. I have tried to summarize our story in the post "The Story of Online Shop", which should give you a better understanding of how we started and what we are trying to achieve.
My co-founder Siraaj and I understood that for us to revolutionize, innovate and, most importantly, scale without flaw, we would need to feed our engine pinpoint-accurate data from the get-go. Unfortunately, trying to devolve from a marketplace back to a shop creation suite, the original concept before our marketplace, was a monumental, if not impossible, task. For one, it would create immense flaws in the design itself, which at scale would hinder continuous development. This is why we refactored almost every line of our original code.
The current version of our engine, which took us months to develop, refine and optimize, and is still in development, came from an idea planted like a seed in my mind after witnessing and experiencing what IBM's Watson was capable of.
The original concept behind the engine was to help and support advertisers and in-house marketing teams in pursuit of better ad campaigns, automated quality-of-life features and much more. You can find the original post, which touches on the first theoretical possibilities of using Watson to help advertisers, here: "IBM and the Next Generation of Advertising".
Over the years I continued to develop and expand on these thoughts, ideas, concepts and models, both in theory and in practice, yielding varied and mostly inconsistent results across a spectrum of different offerings, from social media management tools to product management software. It could provide the automation necessary for some objectives, but it wasn't enough. I wanted to develop a model which would showcase the full power of such an engine - one that is truly automated, applicable to everyday use and requires no human input. With so many different areas available, I was able to experiment a lot. But it always came back to the original eureka moment: it is most applicable in e-commerce.
When the sole vertical was purely e-commerce, the dynamic text functions, for example, made perfect commercial sense for marketplace users who had thousands of products, hundreds of thousands of variants and no time. They allowed for rapid generation of headlines and descriptions, and the rewriting of existing ones to be more 'human' and friendlier to marketplace customers. The same was of little use in a social media management tool, for example, where dynamic text functionality isn't as necessary unless you are conducting wide-scale spam.
Another monumental achievement based on our user feedback was our rapidly adaptable Search Engine Optimization (SEO) model, which implemented a lot of complex methodologies, data feeds, live observations via Computer Vision and much more.
From the illustration above you can see the impact such a model had on organic traffic. Outranking brand-owned terms at the root domain level is, in this day and age, nearly impossible. This was achieved through a combination of several models within our engine, another being media compression and resizing without loss of quality, served for various display sizes.
Of course, as a marketplace this brought some disadvantages. As per the illustration above, you can hazard a guess at the response from bigger retailers; it was not a positive one. It meant that retailers could be outcompeted on their own products, and for every sale made via our marketplace they would be charged 3% on top of the product sales price.
While it was not popular among bigger retailers, for our smaller vendors this was an amazing opportunity to gain more sales without any additional cost. But as mentioned in "The Story of Online Shop", running a marketplace was, in the long term, a monumental challenge, and one we started enjoying less and less. Yet we knew that what we had developed, and were still developing, was truly revolutionary.
Siraaj and I decided this was the opportune time, even if it meant we would become destitute in pursuit of our vision. It is a necessary, and perhaps even small, sacrifice to do what has not been done before.
Our conceptual idea is rather simple when reduced to its simplest layman's terms: use our engine, the combination of what we have designed, developed and executed within the fields of AI and ML, to empower our users.
Some of the aforementioned functions are being expanded upon and upgraded even further. Unlike our competitors, who will not be able to develop their own models quickly enough to compete within the market, we have the luxury of a complete 'reset'.
To create a fully scalable model it is necessary to feed precise data from hundreds or even thousands of touch points, at scale, into our engine to ensure precise function. Each algorithm, function and line of code that was first conceptualized and written by a human can be updated, optimized and scaled by learning models and methodologies designed to allow it. This is the only way to create a truly scalable ecosystem that supports dynamic input and output.
Many within our space do not have the luxury of a reset, a perfectly designed start, or the ability to refactor code quickly or adequately enough to allow a seamless transition. Many of our competitors operate on systems held together by tape; introducing radical change takes time, not to mention the bureaucracy created by managerial departments that stifles any such effort, sometimes grinding it to a halt and ensuring competitive advantage within the market is lost. It is my belief that startups which embrace and adopt such models - the new, radical ways of development, growth and scaling that the leeway of a startup provides - will thrive.
How Does It Work?
Okay, so you registered for an account and chose a theme for your shop initialization. You adjusted the available settings, uploaded your products, logo and brand images, and came up with some stellar copy.
You're excited and start sharing your new shop across every possible medium you can think of. You're getting the traffic, but no products appear to be added to shopping baskets, no wishlists are being made, and there are no sales!
If you were with any competing service you'd probably feel discouraged right about now - not to mention all the time you have wasted, and the money spent on the additional plugins, design services and upsells that appear every time you open a new page, promising to help you 'sell more'. Unfortunately for you, our competitors neglect the basic principles of good user experience, and even of support. Ironically, many of them laud themselves as the best for customizability and freedom. Perhaps so, but an overabundance of tools and features, and the forced reliance on third-party plugins, is extremely discouraging and disheartening for any new user.
Thankfully our proprietary analytics solution has captured enough data to feed it into our engine to start making optimizations. No human input necessary. No tiring A/B tests. No more smashing your head on the keyboard trying to think of a witty call to action to draw in users to convert.
With enough data points covering the basics - locale, device type, screen resolution, browser, language and operating system - plus the more advanced points that can only be precisely captured through specific event tracking combined with heatmapping, the engine can output an analytical, data-driven view of every single element of a shop, and of every event that took place.
With the addition of behavioral, affinity and interest-based data, our engine is able to dynamically change every aspect of a shop to be as unique and as intuitive as possible, reducing all possible friction between the user and their decision making.
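To make the kind of data point described above concrete, here is a minimal sketch in Python. The field names and values are purely illustrative assumptions, not our production schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of a single captured data point; every field name
# here is illustrative, not our actual analytics schema.
@dataclass
class VisitorEvent:
    locale: str               # e.g. "en-GB"
    device_type: str          # "mobile", "tablet" or "desktop"
    screen_resolution: tuple  # (width, height) in pixels
    browser: str
    language: str
    operating_system: str
    event_name: str           # e.g. "add_to_basket", "scroll_depth"
    element_id: str           # which shop element the event targeted
    heatmap_cell: tuple       # coarse (x, y) grid cell for heatmapping

# An example event, as an analytics layer might record it
event = VisitorEvent(
    locale="en-GB",
    device_type="mobile",
    screen_resolution=(390, 844),
    browser="Safari",
    language="en",
    operating_system="iOS",
    event_name="add_to_basket",
    element_id="product-card-42",
    heatmap_cell=(3, 7),
)
print(event.event_name)  # add_to_basket
```

Many events of this shape, aggregated per element and per heatmap cell, are what give the engine its per-element view of a shop.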
These changes are completely dynamic and in a constant 'limbo' state. This is achieved through 'jailing', our way of restricting human input from changing crucial elements of a shop, such as the insertion of custom code, which could significantly impact the operational functions of the engine. Jailing ensures that every instance is bound by the same standards and rules from the get-go, and is able to evolve and, of course, scale.
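One simple way 'jailing' could be enforced is by screening user-supplied settings for code-like content before they are saved. This is only a sketch under that assumption; the marker list and function name are invented for illustration:

```python
# Illustrative sketch of 'jailing': reject shop settings that contain
# custom code, so the engine's assumptions about a shop stay intact.
# The blocked markers and the function name are hypothetical.
BLOCKED_MARKERS = ("<script", "javascript:", "onclick=", "{%", "<?php")

def is_jailed_safe(value: str) -> bool:
    """Return True if a user-supplied setting contains no custom code."""
    lowered = value.lower()
    return not any(marker in lowered for marker in BLOCKED_MARKERS)

print(is_jailed_safe("Welcome to my shop!"))          # True
print(is_jailed_safe("<script>alert('hi')</script>"))  # False
```

A real implementation would of course need proper parsing rather than substring checks, but the principle is the same: inputs that could alter crucial elements never reach the shop instance.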
One of the biggest benefits of this model is the completely automated process of analysis, design and optimization - a process which conventionally requires significant human input and vigilant monitoring, and is prone to human error. Not to mention the time and capital required should one hire a design team, system admin and marketing manager, plus the necessary spend on third-party tools and plugins. This model negates any need for the aforementioned.
One drawback is the computational power currently required to process all the data captured for analysis and execution, with the biggest strain being heatmapping and the calculations needed to ensure every single bit of data is meaningful enough to optimize properly. Right now the model is locked to a set group of visitors and customers rather than to a unique visitor or customer; for example, the engine will only make adjustments to the design based on data collected from 50 users instead of just one, which means the shop is optimized towards a very broad audience rather than per individual. As more and more people adopt our service, scaling this to a unique instance and profile per visitor/customer becomes entirely possible, and can extend across dozens of shop instances.
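The cohort threshold described above can be sketched in a few lines. The 50-user figure is from the text; the function name and "most common preference wins" aggregation are simplifying assumptions for illustration:

```python
# Sketch of the cohort gate: design adjustments are only applied once data
# from a minimum group of visitors (50, per the text) has been collected,
# so optimization targets a group rather than a single person.
from collections import Counter

COHORT_SIZE = 50

def preferred_variant(observations):
    """Return the most common observed preference once the cohort is full,
    or None if there is not yet enough data to act on."""
    if len(observations) < COHORT_SIZE:
        return None  # not enough data yet; keep the current design
    winner, _count = Counter(observations).most_common(1)[0]
    return winner

# 50 simulated observations: 30 visitors favored the dark theme, 20 the light
data = ["dark"] * 30 + ["light"] * 20
print(preferred_variant(data))       # dark
print(preferred_variant(data[:10]))  # None - cohort not yet full
```

The computational cost mentioned above comes from running this sort of aggregation not over one preference, but over every tracked element and heatmap cell of every shop.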
This allows the engine to comprehend 'uniqueness'. In the simplest of terms, Customer 1's data will be collected across the variety of shop instances they interact with. Based on this shared data cluster, a 'hive-mind'-like consciousness will develop, allowing every shop visited by an individual user to adjust and optimize for that individual rather than a set group. Ultimately, all shop instances will benefit from and share data as a collective. It is, in a sense, an inversion of today's setup, where shop instances operate at an individual level and users are treated at a collective level. This is the only viable way to scale properly; it requires immense effort, but we are already seeing a lot of promise.
The above illustration showcases data captured from a repeat customer who created an account using Meta's Facebook and agreed to share additional data, such as interests. In practice, this data will of course be obfuscated and encrypted. Some manual input is still necessary; for example, the engine is unable to accurately predict the amount of discounting appropriate to entice a customer to purchase a product, as an abundance of data is needed to solve this via the creation of customer categories and variants. This is not of immediate concern, as most users who run a shop will at first want some control over the engine.
The perfectly precise data captured from a repeat customer allows the engine to adjust every single element of a shop, as per the illustration above: a switch from light to dark contrast, an incentive to spend more for free express shipping, movement of the logo, and brand logo colors adapted to remain visible on the black background overlay, with copy dynamically generated on each visit. Of course, this is only one part of the entire shop instance; it is, after all, dynamic, and every element is adjusted by the engine.
Our Own Singularity
The models encompassing our engine, its algorithms and everything in between are made to capture precise data from hundreds of touch points, specifically developed for the ecosystem we are creating to read, analyze, optimize, predict and adjust. With everything else in mind, it became apparent to us that in the near future, with enough time for learning and optimal adjustments at scale, we could reach our own singularity.
We are already seeing viable results, although much more data is necessary - perhaps petabytes of it - to even start such an endeavor. But with enough of it, the possibility is there.
But what exactly do I mean by singularity? Within this context, our aim is to develop an ecosystem so robust that its very operation allows our engine to scale automatically - developing, analyzing and optimizing itself. A sandbox, if you will. Such a feat would completely negate the need for human input to create. Imagine uploading just one product, with the collective 'hive-mind' generating a completely unique shop instance requiring no copywriting, oversight, or even thinking. With the recent partnership between ourselves and Ecompilot, it also means one will only need a single product as a base to get suggestions for dozens more related ones - or perhaps nothing will need to be input at all. And we have already seen the possibility of it happening.
A shop is largely based on dynamic content, so it is far more difficult and resource-intensive to develop than one that uses solely static content - such an instance would essentially put you back in the early 90s, where no cart functionality exists. Dynamic content is the ability to constantly add, remove, update and optimize in order to serve personalized information. We have already seen viable results with static pages, which do not require database functionality for constant data entry - something absolutely necessary for managing products, creating blog posts, recording reviews and allowing users to create accounts to manage orders, to name a few.
Going back to the points made and examples given within 'Our Thesis'...
When we first introduced parts of the engine into our marketplace, we saw our sellers' products quickly rank first on search engines such as Google and Bing, outranking even those of Amazon and eBay, two of our largest competitors at the time. This saved us a lot of money on marketing and made our vendors very happy.
We continued to expand on this model and its development, conducting more and more experiments of varied nature. One of particular interest was the creation of full-fledged websites containing only static content. With the data we had from our marketplace, and using some of the most popular shopping websites on the internet plus extensive data aggregation, our engine could create full-fledged static websites with dozens of pages in mere minutes, within set boundaries for satisfactory results such as 'Shoe Related' and 'News Related'. All were equally optimized, with copy as if written by some of the best copywriters - copy which would quickly update to ensure the highest possible organic ranking via constant comparison with similar websites.
The beauty does not end there. When I was younger, I worked for a marketing company that used third-party data on weather, competitor positioning on search, and even TV commercials to create automations that adjusted bids on ads. For a big retailer such as Amazon, stocking hundreds if not thousands of products at a time, in addition to third-party sellers, means running hundreds if not thousands of ad campaigns. Now multiply this across various platforms and channels, from PPC (Pay Per Click) to Display, from Google and Bing all the way to the likes of Facebook and Twitter. The resources necessary to maintain so much, and to ensure money is not wasted, are a true feat - impossible without immense human input and automation. Add localization, language, currency and many other variables, and it would seem nearly impossible to manage. This is where automation becomes a complete necessity.
Yes, you can introduce remarketing campaigns to offset the prospecting spend, and use post-impression tracking to measure how effective an ad was in a customer's decision to purchase - something that would otherwise only yield data on events taken directly from that particular ad. In the simplest terms, smart attribution tells the advertiser how many times you saw a particular ad on the Google Display Network before you went and bought the advertised product after finding it on Search. This gives a deeper view of the effectiveness of certain campaigns, even those that do not appear effective at first glance.
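The post-impression counting just described can be sketched as a walk over a customer's event journey. The event names and record shapes are assumptions made for the example:

```python
# Sketch of post-impression attribution: count how many display-ad
# impressions a customer saw before their first purchase that came via
# search. Event names and structure are hypothetical.
def impressions_before_purchase(events):
    """Count display impressions preceding the first search-led purchase."""
    seen = 0
    for event in events:
        if event["type"] == "display_impression":
            seen += 1
        elif event["type"] == "purchase" and event.get("channel") == "search":
            return seen  # attribute these impressions to the purchase
    return 0  # no search-led purchase occurred

# One customer journey: three display ads seen, then a purchase via search
journey = [
    {"type": "display_impression"},
    {"type": "display_impression"},
    {"type": "display_impression"},
    {"type": "purchase", "channel": "search"},
]
print(impressions_before_purchase(journey))  # 3
```

Aggregated over many journeys, counts like this are what let an advertiser see that a display campaign contributed to purchases it never directly "closed".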
Now imagine if you could adjust your bidding strategy based on the weather. Raining? Pause all ads promoting outdoor pool supplies and increase the bids on raincoats and Wellington boots.
Did eBay show a commercial for Dyson vacuums at a particular time on TV in a particular area of the country? The next time it happens, Amazon, in this example, will automatically increase its bids on Dyson-related terms across its own campaigns in those areas.
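Both of the examples above are trigger rules mapping an external signal to a bid change, and can be sketched as a tiny rule engine. Campaign names, signal names and multipliers here are all made up for the sketch:

```python
# Illustrative rule engine for the bidding examples above: weather and
# competitor-TV signals adjust per-campaign bids. All names and
# multipliers are hypothetical.
def adjust_bids(bids, signals):
    """Return new bid amounts after applying simple trigger rules."""
    adjusted = dict(bids)
    if signals.get("weather") == "rain":
        # Raining: pause outdoor-pool campaigns, boost rainwear bids
        adjusted["outdoor_pool_supplies"] = 0.0
        adjusted["raincoats"] = round(adjusted["raincoats"] * 1.5, 2)
    if "dyson_vacuums" in signals.get("competitor_tv_ads", []):
        # Competitor TV spot detected: raise bids on related terms
        adjusted["dyson_terms"] = round(adjusted["dyson_terms"] * 1.3, 2)
    return adjusted

bids = {"outdoor_pool_supplies": 1.20, "raincoats": 0.80, "dyson_terms": 1.00}
signals = {"weather": "rain", "competitor_tv_ads": ["dyson_vacuums"]}
print(adjust_bids(bids, signals))
# {'outdoor_pool_supplies': 0.0, 'raincoats': 1.2, 'dyson_terms': 1.3}
```

A production system would localize these rules per region and channel, but the shape stays the same: external signal in, bid adjustment out.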
This is something I helped develop, and something we will continue to expand on within our engine to create a truly sophisticated, connected and self-evolving ecosystem: capturing weather data and search placements while continuing to perfect localization.
Hopefully this provides understandable insight into our goals for the foreseeable future. I did my best to make it as easy to digest as possible, as our aim will always be to simplify the complex.
We will not be introducing our engine to the public yet, as we need to ensure it is fully refined and works perfectly. For now, shop instances will be able to be created with ease and with an abundance of features from the get-go, such as turning your shop into an installable app. The data will be fed into our engine, obfuscated and encrypted, to ensure it is up to our standard.
A lot of the points touched on could change as we enter the market and listen to community input. We will also be introducing 'Enterprise'-level features in the near future for those who want the complete level of customizability offered by our competitors; this will, however, not allow our engine to do all the work for you.