Price Points by Omnia Retail


How robotics and AI are improving supply chains

If only Henry Ford, the founder of Ford Motors who invented the assembly line that revolutionised how cars are made, could see how corporations have advanced the logistics of supply chains in 2022. Approximately 109 years later, modern supply chains incorporate engineering and scientific developments like never before, bringing robotics and artificial intelligence (AI) to the forefront to increase productivity, decrease overheads and improve the customer experience. With the rapid development of e-commerce and the changing landscape of consumer spending habits, it has become vital for retailers and brands in many industries to rethink and modernise how they take a product through production, manufacturing, shipment and delivery to the consumer. We have taken a look at how robotics and AI are affecting and improving global supply chains, the companies that run them and their employees.

Robotics in retail and e-commerce

One of the biggest issues within retail and e-commerce is delivery - and fast delivery at that. Using robotics, AI and automation (RAIA) has been shown to significantly improve delivery schedules and times. According to a 2021 McKinsey & Company survey, 75% of retail supply chain leaders have made two-day delivery a priority and 42% are aiming for same-day delivery in 2022. Alongside these consumer demands, 64% of retailers cited digitalisation and automation investments as critical.

A key area in speeding up deliveries and creating seamless supply chains is warehouse automation, and one retailer that has taken on the challenge is Ochama in the Netherlands. The Chinese e-commerce giant launched robotic grocery stores in four Dutch cities: Leiden, Amsterdam, Utrecht and Rotterdam. Groceries and non-food items are collected throughout the store by mobile robots, then packaged and presented to shoppers.
Customers order their parcels via the Ochama app and collect them at the store by scanning a barcode that identifies the order, after which a conveyor belt and robotic arms hand over the order. This is only one part of the operation that uses robotics: a 20,000-square-metre warehouse is equipped with automated systems that can process up to 15,000 parcels a day. Thanks to the technological advancements used in its supply chain, Ochama has brought down the overall cost of food and non-food items by 10%, making the omnichannel retailer one of the first able to reduce some consumer living costs using robotics and AI.

From groceries to clothing, AI is showing its exponential value. Finesse, a US clothing company, is using AI to determine future fashion trends for potential markets and to move away from fast fashion. When you visit their website, the clothing doesn't actually exist yet: what you see are 3D-rendered items of clothing, and shoppers vote for the items they would like to buy. The items with the most votes get made, which reduces overstocking and lowers production costs. In this case, AI is an integral part of the business model rather than a background assistant to supply chain problems.

AI has proven helpful with returns

This particular business model, where votes determine production, doesn't mean returns and refunds aren't still an option - or a problem, depending on which side of the e-commerce street you're sitting on. Nonetheless, AI has also proven helpful with e-commerce's biggest headache: returns. Global e-commerce's returns problem is estimated to cost €111 billion over the festive season after December alone. Approximately 30% of all online orders are returned by customers, making it a very large and expensive problem.
In fact, although a customer may enjoy "free and easy returns", a typical return actually costs a retailer between €19 and €41 each time once transport, processing (receiving, inspecting and sorting) and reselling efforts are factored in. Berkshire Grey found that processing time could be reduced by 25% and processing costs by 35% if employees could make use of automation and robots.

How do employees view the incorporation of robotics and AI?

A 2021 study published in the Journal of Technology in Behavioural Science conducted multiple interviews with employees of different seniority levels across multiple industries to gauge their understanding and perception of RAIA in the workplace. Firstly, the study found that employees feel "human touch" and "soft skills" could never be replaced or replicated; secondly, that employees should view RAIA as an opportunity and not a threat; thirdly, that employees may experience a job satisfaction dilemma; and lastly, that employees feel companies should be thoroughly prepared, before and after RAIA is implemented, for whatever the impact may be.

There is no doubt that jobs, workplaces and employee-to-customer or employee-to-employee relationships will change, but it is important for companies and team members alike to start viewing RAIA as a way to upskill, revolutionise and grow. There seems to be a common misconception that including robotics and AI in the workplace automatically results in retrenchments, firings and an exodus of employees. Although we can't speak for the intentions of all companies, robotics has in many cases been shown to improve the work environment for employees. If employees have been spending valuable time on mundane or time-consuming tasks that are part of their job, they can now spend that time on strategy: the very thing that results in better productivity and more profit.
With our fully or partially automated dynamic pricing software solutions, Omnia takes a similar stance: users spend less time on repetitive, high-volume tasks and more time planning and managing the strategic direction of prices.

Looking toward the future

Warehousing, final assembly and production are the three main areas where autonomous robots will be most beneficial. Deloitte predicts that introducing robotics in these areas can increase productivity, improve data collection and decrease the risk of hazardous tasks, with robots working alongside humans for improved efficiency and safer work environments. A McKinsey & Company study concluded that 20-30% of employees' time can be freed up for other important tasks if repetitive tasks are automated or robotised. Deloitte suggests that the use of autonomous robots within the supply chain will increase dramatically over the next five years, and the more companies incorporate robotics into their processes, the more fluid and seamless supply chains will become.

Doing ETL Process with Azure Technologies

Introduction

Omnia processes millions of retailers' product records each day to deliver the best prices according to their needs. To gain an advantage from machine learning techniques, Omnia began a new project to back up and utilise its data, improve its business analytics and explore machine learning. The project has two main objectives:

- Back up the data (about 38 GB per day, with expected growth in the coming months);
- The ability to create multiple ETL (Extract, Transform, Load) processes to make the data available for diverse purposes. This project's ETL process transforms unstructured data into business report data.

Alongside that, we added the following technical requirements:

- Traceability and monitoring of the backup and the ETL;
- An easy way to recover if something goes wrong in the process;
- The ability to scale speed linearly with the amount of data;
- Flexibility to create multiple distinct ETL processes.

As for constraints:

- Internet speed from our datacenter to Azure (about 50 MB/s);
- The Couchbase version (3.0.0), cluster size and specs, since that is the source we retrieve from for the backup.

Development

The first step was to choose the technology stack, and since Omnia is primarily a .NET shop we investigated Azure. This seemed reasonable given the current offering and easy team adoption. To satisfy the backup requirement, Azure Data Lake Store (ADLS) seemed the appropriate choice because it offers:

- Unlimited storage capacity;
- Redundant backups on the fly;
- An HDFS-based design that supports big files (we are talking hundreds of gigabytes).

Omnia currently has its infrastructure designed around Hangfire to allow easy scheduling and monitoring of background jobs, so the backup to ADLS was created as another Hangfire job. We used the Dataflow library to achieve a good degree of parallelism when inserting documents into ADLS.
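As a rough illustration of that stage, here is a minimal Python sketch of the parallel backup pipeline. The real implementation is a C# Hangfire job built on the TPL Dataflow library; `upload_to_adls` is a hypothetical stand-in for the actual ADLS client call, and the batch and worker counts are illustrative, not Omnia's settings.

```python
from concurrent.futures import ThreadPoolExecutor

def upload_to_adls(path, payload):
    # Hypothetical stand-in for the real ADLS client call; here we
    # just report what would be written and how many bytes it holds.
    return (path, len(payload))

def backup_documents(documents, batch_size=100, workers=8):
    """Upload documents to ADLS in parallel batches.

    Mirrors the shape of the Hangfire job: split the document
    stream into batches, then push each batch through a pool of
    workers, analogous to a TPL Dataflow ActionBlock with a
    bounded degree of parallelism.
    """
    batches = [documents[i:i + batch_size]
               for i in range(0, len(documents), batch_size)]
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [
            pool.submit(upload_to_adls,
                        f"Backups/batch_{n}.json",
                        "".join(batch))
            for n, batch in enumerate(batches)
        ]
        for f in futures:  # collect in submission order
            results.append(f.result())
    return results

# Example: 250 one-character "documents" become 3 upload batches.
uploads = backup_documents(["x"] * 250)
```

The bounded worker pool is what keeps throughput high without overwhelming the 50 MB/s link mentioned in the constraints.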
We decided to zip the documents (with a 4 MB size cap) because text documents compress well, we use less bandwidth between our datacenter and Azure, and file size affects Azure Function processing time (a real constraint, since Azure Functions have short lifespans).

For the ability to create multiple ETL processes with ease, Azure Functions seemed the best fit due to their serverless nature. Azure Functions are an implementation of serverless architecture, which lets you execute business logic without thinking about the underlying infrastructure: the functions run in stateless compute containers that are event-triggered. Moreover, Azure Functions scale on demand, fail over automatically and let us write the business logic in C#, which gives us flexibility for business requirements. To trigger the Azure Functions we selected Azure Storage Queues, which provide a reliable way to transmit information between applications. These characteristics met the non-functional requirements of scalability and availability needed to transform hundreds of gigabytes of Omnia data.

As for the business reports requirement, Azure SQL Data Warehouse (SQL DW) seemed a possible fit because it:

- Manages large volumes (terabytes) of information;
- Distributes data across 60 databases (by default) and enables more partitions;
- Has column store indexes, enabling fast reads for every database column;
- Offers T-SQL aggregation functions to summarise data;
- Connects with Azure Analysis Services;
- Integrates easily with other Azure components. SQL DW also has a tool called PolyBase to connect to ADLS; PolyBase parses files from ADLS and maps them into SQL tables.

Complete process

The backup and ETL process is represented by the sequence diagram in Figure 1.

Figure 1: Backup and ETL (simplified process)

The process starts with a Hangfire job batch that inserts data into ADLS.
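That first stage performs the size-capped zipping described above. As an illustrative sketch (the production job is C#, and for simplicity this version applies the 4 MB cap to the uncompressed batch size rather than to the final archive size), the flush-when-full logic looks like this:

```python
import io
import zipfile

CAP_BYTES = 4 * 1024 * 1024  # 4 MB cap per archive

def zip_documents(documents, cap=CAP_BYTES):
    """Group JSON documents into zip archives of at most `cap` bytes.

    Documents are queued for the current archive until adding the
    next one would exceed the cap; the archive is then flushed and
    a new one started. Returns a list of zipped payloads (bytes).
    """
    archives = []
    current = []       # documents queued for the current archive
    current_size = 0   # uncompressed size of the queued documents

    def flush():
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            for i, doc in enumerate(current):
                zf.writestr(f"doc_{i}.json", doc)
        archives.append(buf.getvalue())

    for doc in documents:
        data = doc.encode("utf-8")
        if current and current_size + len(data) > cap:
            flush()
            current, current_size = [], 0
        current.append(doc)
        current_size += len(data)
    if current:
        flush()  # flush the final, partially filled archive
    return archives

# Force small archives with a tiny cap: five 10-byte documents
# under a 25-byte cap yield archives of 2, 2 and 1 documents.
chunks = zip_documents(["a" * 10] * 5, cap=25)
```

Capping the archive size keeps each file comfortably inside an Azure Function's short lifespan when it is unzipped downstream.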
Every time a file is added to ADLS, a message is inserted into the unzip message queue and the Unzip function is triggered. After the file is extracted into ADLS, another message is inserted into the transform message queue to trigger the Transform function, which splits the file into several CSV files. When all files are converted to CSV, PolyBase is manually triggered to insert the data into SQL DW. This process has a central information point, ADLS, and good scalability thanks to the Azure Functions. Traceability is achieved by using the full path as the file name.

An important aspect of organising the ETL's information was the folder structure. Each ETL phase has a specific folder, identified by a date-based folder structure and a unique file name:

- Backups: Year/Month/Day/
- Unzip: Etl/Unzip/Year/Month/Day/Guid_N_Total.json
- Final files: Etl/Csv/Year/Month/Day/BusinessFolder/Guid_N_Total.csv, Etl/Csv/Year/Month/Day/BusinessFolder2/Guid_N_Total.csv, ...

To monitor all these functions we used Azure Application Insights, and we found it easy to see when something went wrong. However, a single monitoring system still needs to be set up to give an overall view of the three different resources, together with an automatic verification system to ensure that all the backup files were transformed and inserted.

Tips from our development process

- Don't build Azure Functions with more than one objective. Keep them short and precise, otherwise they will fail and terminate midway through;
- Keep in mind that the same message might be picked up by multiple Azure Functions, and your process must account for that behaviour;
- Think about the process with monitoring in mind, and with the ability to re-execute specific parts;
- Have a correlation identifier that represents the entire ETL process. It will allow you to aggregate and monitor all the data regarding a specific run;
- Use PolyBase to upload data to SQL DW instead of SSIS. PolyBase was made to work with ADLS, and Microsoft has a lot of documentation on how to use it;
- When adding data to SQL DW, use CTAS (Create Table As Select). It makes data insertion faster, and you can use it to merge two tables without duplicates;
- To achieve SQL DW's full potential you need at least 60 million rows evenly distributed across all 60 databases it provides. Be aware of this when developing your solutions;
- When creating a SQL DW database, study the data carefully: think about how you are going to use it, and how you are going to partition and distribute it across the database.

Conclusion

This project is still underway, but so far we are happy with the results. The process allowed us to back up and transform 38 GB - about 10 million Couchbase documents - in 1 hour and 40 minutes. Our main bottleneck lies in our on-premise infrastructure, where the Hangfire batch jobs run. However, this runtime suits our current needs, and it can be scaled by increasing the number of job slots available to perform the batch. The Azure Functions ran in parallel with the backup, so their time is encapsulated by our throughput into ADLS. PolyBase's ingestion time into SQL DW is a project in itself, because it depends on other factors such as the processing units chosen for the database, the data model design and the amount of information already stored. The next steps will be to find a way to monitor all three components, ensure that all backed-up data was indeed transformed, and automate the PolyBase ingestion.
