Big data describes the large volumes of structured and unstructured data that a business generates on a day-to-day basis. The real advantage of big data does not lie in the amount of data; rather, it lies in how that data is used for the organization's benefit (Hazen et al. 2014). Management can make strategic business moves and better decisions by analyzing big data. By this definition, Big Data as a concept requires three distinct layers before application: more data, processing systems, and analytics. If Big Data only recently entered the supply chain management spotlight, it may be because the technology only recently reached the last layer needed to deliver insights.
Data collection is a crucial process in big data. As the amount of data is huge, it is important to apply a proper process for capturing all the business data. Data collection is only the start of using big data to gain business intelligence; in order to make use of the data over a longer period, storing it is a crucial step (Wu et al. 2014). A consumer-centric product is the key to developing a great innovative product, and determining the requirements of consumers is the first and most crucial stage in developing one.
This report consists of topics such as data collection and storage, data in action and business continuity. The data collection and storage topic describes the systems that collect and store data. The data in action section describes customer-centric product design and includes information regarding recommendation systems. The final section states solutions for maintaining an online business in the event of power outages and other disasters.
Data Collection and Storage:
Data Collection System:
The data collection system of an organization's supply chain management will aggregate and evaluate collections of information in a continuous and efficient way. An organization should implement a modern data collection system in the SCM so that huge amounts of data can be taken into the system using advanced technology (Jagadish et al. 2014). Advanced data collection systems are also more effective at parsing and analyzing the data.
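As a minimal sketch of the idea (not the organization's actual system), a collector might merge records from several hypothetical SCM feeds into one normalized stream for later analysis; the feed names and fields below are invented for illustration:

```python
from datetime import datetime, timezone

def collect(sources):
    """Aggregate raw records from multiple supply chain sources into one
    normalized list, tagging each record with its origin and a timestamp."""
    unified = []
    for name, records in sources.items():
        for rec in records:
            unified.append({
                "source": name,
                "collected_at": datetime.now(timezone.utc).isoformat(),
                **rec,
            })
    return unified

# Hypothetical feeds: store inventory and online sales
feeds = {
    "store_inventory": [{"sku": "A1", "qty": 40}],
    "online_sales": [{"sku": "A1", "sold": 3, "price": 9.99}],
}
records = collect(feeds)
```

A real system would of course stream continuously rather than batch in memory, but the shape of the output (uniform, source-tagged records) is what makes downstream parsing and analysis tractable.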
Type of Data: Inventory
1. Continuous inventory at more than one location
2. Data collected at the individual item level, such as size, style and color
Update frequency: from hourly updates to monthly updates
Sources: inventory at stores, various online vendors, warehouses and internet sources

Type of Data: Sales
1. Details around the sale of the organization's products: quantity, items sold, customer data, price, and time of day as well as date
Update frequency: from hourly and daily to weekly and monthly updates
Sources: distributor sales, international sales, direct sales, internet sales and competitor sales

Type of Data: Location and Time
1. Sensor data to detect the location of inventory within stores and distribution centers, including misplaced inventory and transportation units
2. The data will indicate not only an item's position but also what is near it, who moved it, its forecasted path forward and the route to the forecasted destination
3. Location positions will be time-stamped using mobile devices
Update frequency: the database will be updated frequently with fresh movements and locations

Type of Data: Consumer Behavior
1. Purchasing and decision behavior, such as items bought, timing, items browsed, frequency and dollar value, giving better detail on purchasing and decision behavior
Update frequency: from the click to the use of the card

Type of Data: Consumer Sentiment
1. Face-profiling information about the shopper, including emotion detection and recognition
2. Sentiment regarding products bought, on the basis of "Likes", "Tweets" and eye-tracking data
Table 1: Types of Data in Supply Chain
(Source: Waller and Fawcett 2013)
Collecting Big Data using Body Area Network: In a Body Area Network (BAN), a few sensor nodes are placed on a human subject to collect different vital-sign data, such as heart rate, pulse, diabetes indicators, ECG, breathing rate, humidity, temperature, movement, direction and proximity. The sensor nodes on the human subject can form a star topology, in which a sink node gathers the sensing data from all the body sensors (Quwaider and Jararweh 2015). The gathered data is then aggregated into one packet to reduce the cost and the number of communications in the network. Via Bluetooth, the aggregated packet is sent to a Personal Digital Assistant (PDA) or a smartphone running a BAN monitoring application. A web-service module then uploads the observed data to Internet servers using either Wi-Fi or cellular technology such as 3G or LTE for data communication.
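The sink-node step above (combining every sensor's reading into a single packet before forwarding over Bluetooth) can be sketched as follows; the sensor names, values and units are purely illustrative:

```python
def aggregate_ban_readings(sensor_readings):
    """Sink-node aggregation in a star-topology Body Area Network:
    combine every sensor's latest reading into a single packet so the
    sink sends one message instead of one per sensor, reducing the
    cost and the number of communications in the network."""
    packet = {"type": "BAN_AGGREGATE", "readings": {}}
    for sensor, value in sensor_readings.items():
        packet["readings"][sensor] = value
    packet["count"] = len(packet["readings"])
    return packet

# Hypothetical vital-sign samples collected by body sensor nodes
readings = {"heart_rate_bpm": 72, "temperature_c": 36.8, "ecg_mv": 1.1}
packet = aggregate_ban_readings(readings)
```

The single aggregated packet is what would then be forwarded to the PDA or smartphone for upload.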
Nanophotonics-enabled optical storage techniques will be used for storing the big data of the organization's supply chain and operations. Breakthrough developments in storage systems are needed to transform conventional GB optical discs into ultrahigh-capacity storage media. In recent decades, nanophotonics has emerged as a rapidly expanding field of light–matter interaction, largely owing to recent advances in nanotechnology that permit finer control of material properties at the nanometer scale, as well as the availability of sophisticated nanophotonic probes.
Features of the Storage System: The features of nanophotonics-enabled optical storage are ultrahigh density, multidimensional storage, 3D super-resolution recording and ultrahigh throughput.
Ultrahigh Density: Recent advances in nanophotonics can facilitate either the encoding of information in additional physical dimensions, for example those defined by the frequency and polarization parameters of the writing beam, or the achievement of three-dimensional (3D) super-resolution recording, breaking the conventional storage capacity limit (Gu, Li and Cao 2014).
Multidimensional Storage: Nanophotonics allows for sharp color and polarization selectivity in light–matter interactions at the nanometer scale. For instance, light can couple to surface plasmons, the collective oscillations of free electrons in metallic nanoparticles, which yield deterministic spectral responses associated with their sizes and shapes. These appealing properties make nanoparticles suitable for implementing spectrally encoded memory. Accordingly, based on the principle of light-induced shape transitions, three-color spectral encoding using gold nanorods of various sizes has been demonstrated.
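To see why multidimensional encoding matters for capacity, note that each independently addressable dimension multiplies the effective capacity of the same physical disc. A back-of-the-envelope calculation, with purely illustrative numbers (not measured figures from the cited work):

```python
def multidimensional_capacity(base_gb, wavelengths, polarizations, layers):
    """Effective capacity when each recording spot can be addressed in
    several independent dimensions: capacity multiplies across the
    number of states available in each dimension.
    All inputs here are illustrative, not measured values."""
    return base_gb * wavelengths * polarizations * layers

# e.g. a 25 GB areal disc, 3 spectral channels (gold nanorods of
# different sizes), 2 polarization states and 10 recording layers
capacity_gb = multidimensional_capacity(25, 3, 2, 10)
```

Even with these modest assumed numbers, the multiplicative effect yields a 60-fold capacity gain over the single-dimension disc.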
3D Super-Resolution Recording: A variety of strategies have been proposed and demonstrated to break the diffraction barrier in the near-field region and achieve super-resolved areal memories. However, these approaches do not provide the ability to record information throughout the volume of a medium. Recently, inspired by a diffraction-unlimited far-field imaging approach, researchers have developed super-resolution photoinduction-inhibition nanolithography, which can break the diffraction barrier and achieve 3D super-resolved writing (He et al. 2016).
Ultrahigh Throughput: Various optical parallelism methods for creating multifocal arrays in single laser shots have been proposed and experimentally demonstrated, including microlens arrays, diffractive optical elements, the Debye-based Fourier transform and dynamic computer-generated holograms. Among these methods, the Debye-based Fourier transform enables the formation of diffraction-limited multifocal spots using a high-NA objective in which each focal spot can be dynamically programmed, a necessity for ultrahigh-density optical recording (Gu, Li and Cao 2014).
Data in Action:
Customer Centric Product Design:
Customer-centric is an approach to carrying out business that concentrates on providing a positive customer experience both at the time of sale and after the sale, in order to drive profit and gain competitive advantage. Fader, in Customer Centricity: Focus on the Right Customers for Strategic Advantage, explains that because not all customers turn out to be profitable, companies that aim to be customer-centric and gain strategic advantage should identify their best customers and concentrate on building their products and services around the requirements of those particular people (Seybold 2014). This is accomplished by gathering customer data from multiple channels and analyzing it to better understand and segment customers.
One approach to working out whether a customer is high quality, according to Fader, is to calculate their customer lifetime value (CLV), which predicts the net profit a business will obtain from its entire future relationship with a customer. High-quality customers are those who remain loyal to the company and do not leave unless given a very strong incentive to do so. These customers have a high CLV and, collectively, a low attrition rate (Kolko 2015). Customer-centric organizations strive to acquire, retain and develop this type of customer by improving their experience.
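A simple textbook-style CLV calculation (a sketch, not Fader's exact model) discounts each future year's expected margin by both the retention probability and a discount rate; the dollar figures and rates below are hypothetical:

```python
def customer_lifetime_value(margin, retention, discount, horizon_years):
    """Simplified CLV: discounted sum of expected yearly profit, where
    the customer survives each year with probability `retention` and
    future profit is discounted at rate `discount`."""
    clv = 0.0
    for t in range(1, horizon_years + 1):
        clv += margin * (retention ** t) / ((1 + discount) ** t)
    return clv

# Hypothetical numbers: $100 yearly margin, 10% discount rate
loyal = customer_lifetime_value(100, 0.90, 0.10, 20)   # 90% retention
churny = customer_lifetime_value(100, 0.50, 0.10, 20)  # 50% retention
```

The comparison makes the point in the text concrete: the loyal customer's predicted value is several times that of the high-attrition customer, which is why customer-centric organizations prioritize retention.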
Four steps can be carried out for implementing customer-centric product design within the organization. The four steps are as follows.
Listening: The organization must listen often, early and in different ways to understand the perspective of the customers. Involving prospective consumers in the ideation and innovation process is the key to collecting data. Listening will be done continuously throughout the product development process (Verhoef and Lemon 2013). This will give the organization an idea of what needs to be changed in the product or what needs to be developed. The more the organization listens to its consumers, the more the quality of the design process will improve. There are various methods the organization can use to gather data from consumers, such as questionnaires, interviews, brainstorming and many more.
Asking Questions: The questions that the organization asks consumers can affect the quality of the data collection process. It is highly recommended that the right questions be asked of consumers; the required data can be gathered only if the queries are posed properly (Chuang, Morgan and Robson 2015).
Collecting Deep Data: Customer-centric product design is a data-driven process. The organization must analyze the gathered data properly so that crucial information can be revealed (Verhoef and Lemon 2013).
Invest: The organization must invest in the product-related data gathering process. In order to acquire every piece of data required to develop the customer-centric product, all methods adequate for the process must be considered.
Recommendation systems use a number of different technologies. The study classifies these systems into two broad groups.
Content-based systems examine the properties of the items recommended. For instance, if a Netflix user has watched many cowboy movies, the system recommends a movie classified in the database as having the "cowboy" genre (Lorenzo-Romero, Constantinides and Brünink 2014).
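A minimal content-based sketch of the Netflix example above: score each unseen title by how many genres it shares with the titles the user has already watched. The catalog and titles are invented for illustration:

```python
def content_based_recommend(user_history, catalog):
    """Content-based filtering sketch: rank unseen items by genre
    overlap with the genres of items the user has already watched."""
    liked_genres = set()
    for title in user_history:
        liked_genres |= catalog[title]
    scored = [
        (len(catalog[title] & liked_genres), title)
        for title in catalog
        if title not in user_history
    ]
    scored.sort(reverse=True)  # highest genre overlap first
    return [title for score, title in scored if score > 0]

# Hypothetical catalog mapping titles to genre sets
catalog = {
    "High Noon": {"cowboy", "drama"},
    "True Grit": {"cowboy"},
    "Alien": {"sci-fi"},
    "Westworld": {"cowboy", "sci-fi"},
}
recs = content_based_recommend(["High Noon"], catalog)
```

Having watched one cowboy movie, the user is recommended the other cowboy titles but not the pure sci-fi one; real systems use richer item attributes (actors, keywords, descriptions) and weighted scores, but the mechanism is the same.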
Collaborative filtering systems recommend items based on similarity measures between users and/or items: the items recommended to a user are those preferred by similar users. However, these technologies by themselves are not sufficient, and there are newer algorithms that have proved effective for recommendation systems.
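User-based collaborative filtering can be sketched with cosine similarity over rating vectors: find the user most similar to the target, then recommend whatever that user rated which the target has not seen. The users and ratings are invented for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity between two users' rating dicts; only items
    rated by both contribute to the dot product."""
    dot = sum(u[i] * v[i] for i in u if i in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def collaborative_recommend(target, others):
    """User-based collaborative filtering sketch: pick the most similar
    user and recommend their items that the target has not rated."""
    best = max(others, key=lambda other: cosine(target, other))
    return [item for item in best if item not in target]

# Hypothetical ratings on a 1-5 scale
alice = {"m1": 5, "m2": 4}
bob = {"m1": 5, "m2": 5, "m3": 4}   # tastes overlap strongly with alice
carol = {"m4": 5, "m5": 2}          # no overlap with alice
recs = collaborative_recommend(alice, [bob, carol])
```

Bob is the nearest neighbour, so Alice is recommended the item only Bob has rated; production systems extend this with many neighbours, rating normalization and item-item variants.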
Application of Recommendation System in SCM and Operations Management: Perhaps the most important application of recommendation systems is at online retailers. We have noted how Amazon and similar online vendors attempt to present each returning customer with several suggestions of products they might like to buy. These suggestions are not random, but are based on the purchasing decisions made by similar customers or on the other techniques discussed in this section.
The following approaches can be considered for ensuring business continuity during power outages.
Using Backup Generator: The first thing to look into is a backup generator. These big hulking boxes of electricity usually sit outside the business walls and get quite loud when they are running. A backup generator may never need to be used, but it greatly improves the odds of business continuity (Chang 2015).
Disaster Recovery Plan: Regardless of industry or size, when an unexpected event occurs and causes daily operations to stop, an organization should recover as quickly as possible to ensure it keeps providing services to customers and clients (Sahebjamnia, Torabi and Mansouri 2015). Downtime is one of the largest IT costs any business can face. Based on 2015 disaster recovery statistics, downtime that lasts one hour can cost small companies as much as $8,000, medium-sized organizations $74,000, and large enterprises $700,000. For SMBs in particular, any extended loss of productivity can lead to reduced revenue through late invoicing, lost orders, increased labor costs as staff work extra hours to recover from the downtime, missed delivery dates, and so on. If major business disruptions are not anticipated and addressed today, it is entirely possible that the negative outcomes of a sudden disaster will have long-term implications that affect an organization for years (Sahebjamnia, Torabi and Mansouri 2015). By having a disaster recovery plan in place, an organization can save itself from numerous risks including out-of-budget costs, reputation loss, data loss, and the negative impact on customers and clients.
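The downtime figures quoted above make the cost of an outage easy to estimate; a small worked example using those 2015 per-hour rates (the four-hour outage duration is hypothetical):

```python
def downtime_cost(hours, hourly_cost):
    """Total cost of an outage: duration times the per-hour downtime
    cost for an organization of the given size."""
    return hours * hourly_cost

# Per-hour downtime costs from the 2015 disaster recovery statistics
HOURLY_COST = {"small": 8_000, "medium": 74_000, "large": 700_000}

# Estimated cost of a four-hour outage at each organization size
costs = {size: downtime_cost(4, rate) for size, rate in HOURLY_COST.items()}
```

Even a short outage puts a large enterprise several million dollars out of pocket, which is the economic argument for investing in a disaster recovery plan up front.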
From the above study, it can be concluded that big data can be used for gaining business intelligence in supply chain management and operations management. Although "big data" has become a contemporary buzzword, it has significant implications in our discipline, and presents an opportunity and a challenge to our approach to research and teaching. We can easily see how data science and predictive analytics apply to SCM, but sometimes find it more difficult to see the direct connection of big data to SCM and operations management. By implementing a recommendation system, an organization can establish better engagement with consumers, which is crucial for supply chain management. Control over inventory and retailing can also be enhanced through the use of a recommendation system.
The purpose of the study is to provide a timely assessment of the field and motivate additional research and pedagogical developments in this domain. As was illustrated, the field of SCM predictive analytics provides a promising avenue for transforming the management of supply chains, and offers an exciting array of research opportunities. Optimizing the supply chain with big data is essential for all organizations. As the volume of operations and supply chain data continues to grow, maintaining that data is becoming an issue for organizations. If big data is used properly for generating business intelligence, it can allow organizations to capture, store and analyze massive amounts of data effectively.
The report has been able to provide descriptive information regarding various factors that can have an impact on supply chain management through the use of big data. The study did not, however, link the data collection system with the recommendation system.
Understanding the Scale, Scope and Depth of Data: Driving contextual intelligence requires that supply chain management use sufficiently large data sets and understand their scope, depth and scale. In total, fifty-two types of big data need to be collected by supply chain management.
Making Supply Networks more Complex: Big data enables more complex supplier networks that focus on knowledge sharing and collaboration as the value-add, rather than simply completing transactions. Big data is changing how supplier networks form, grow, proliferate into new markets and mature over time. Transactions are not the only goal; creating knowledge-sharing networks is, based on the insights gained from big data analytics.
Integrating Big Data and Advanced Analytics: Big data and advanced analytics are being integrated into optimization tools, demand forecasting, integrated business planning, and supplier collaboration and risk analytics at a quickening pace. Control tower analytics and visualization are also on the roadmaps of supply chain teams currently running big data pilots.
Considering Big Data Analytics an Important Technology: Sixty-four percent of supply chain executives consider big data analytics a disruptive and important technology, setting the foundation for long-term change management in their organizations.
Utilization of Geo-analytics: Geo-analytics based on big data can be used to merge and optimize delivery networks. One of the examples given is how the merger of two delivery networks was planned and optimized using geo-analytics. Combining geo-analytics and big data sets could drastically reduce cable TV technician wait times and drive up service accuracy, solving one of the best-known service challenges of companies in that business.
Recognizing Impact of Big Data: Big data is improving organizations' reaction time to supply chain issues (forty-one percent of respondents), increasing supply chain efficiency by 10% or greater (thirty-six percent), and enabling greater integration across the supply chain (thirty-six percent).
Embedding Big Data Analytics within the Operations: Embedding big data analytics in operations leads to a 4.25x improvement in order-to-cycle delivery times, and a 2.6x improvement in achieving supply chain efficiency of 10% or greater.
Understanding the Impact of Big Data on the Organization’s Finance: Big data provides greater contextual intelligence about how supply chain tactics, processes and operations affect financial objectives. Supply chain visibility often refers to being able to see multiple supplier layers deep into a supply network. Being able to trace the financial outcomes of supply chain decisions back to financial objectives is achievable, and with big data applications integrated into financial systems it is very effective in industries with rapid inventory turns.
Enhancing Supplier Quality: Big data can increase supplier quality from supplier audit through inbound inspection to final assembly. The organization can develop a quality early-warning system that identifies, and then defines a prioritization framework for, quality problems faster than more traditional methods, including Statistical Process Control (SPC). The early-warning system is deployed upstream to suppliers and extends to products in the field.
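The SPC baseline such an early-warning system would improve on can be sketched with classic Shewhart control limits: flag a supplier whose latest inbound defect rate falls outside the mean ± 3 standard deviations of its history. The defect-rate figures below are hypothetical:

```python
import statistics

def spc_limits(samples, sigmas=3):
    """Classic Shewhart control limits: mean plus/minus `sigmas`
    population standard deviations of the historical defect rates."""
    mean = statistics.mean(samples)
    sd = statistics.pstdev(samples)
    return mean - sigmas * sd, mean + sigmas * sd

def early_warning(history, new_rate):
    """Flag a supplier whose latest inbound defect rate falls outside
    the control limits computed from its own history."""
    lo, hi = spc_limits(history)
    return new_rate < lo or new_rate > hi

# Hypothetical historical inbound defect rates for one supplier
history = [0.010, 0.012, 0.011, 0.009, 0.010, 0.011]
alarm = early_warning(history, 0.05)   # a sudden 5% defect rate
ok = early_warning(history, 0.011)     # a rate within normal variation
```

A big data version of this idea applies the same logic continuously across all suppliers and many more signals, which is how it isolates quality problems faster than batch SPC reviews.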
Chang, V., 2015. Towards a Big Data system disaster recovery in a Private Cloud. Ad Hoc Networks, 35, pp.65-82.
Chuang, F.M., Morgan, R.E. and Robson, M.J., 2015. Customer and competitor insights, new product development competence, and new product creativity: differential, integrative, and substitution effects. Journal of Product Innovation Management, 32(2), pp.175-182.
Gu, M., Li, X. and Cao, Y., 2014. Optical storage arrays: a perspective for future big data storage. Light: Science and Applications, 3(5), p.e177.
Hazen, B.T., Boone, C.A., Ezell, J.D. and Jones-Farmer, L.A., 2014. Data quality for data science, predictive analytics, and big data in supply chain management: An introduction to the problem and suggestions for research and applications. International Journal of Production Economics, 154, pp.72-80.
He, Z., Wang, X., Xu, W., Zhou, Y., Sheng, Y., Rong, Y., Smith, J.M. and Warner, J.H., 2016. Revealing defect-state photoluminescence in monolayer WS2 by cryogenic laser processing. ACS nano, 10(6), pp.5847-5855.
Jagadish, H.V., Gehrke, J., Labrinidis, A., Papakonstantinou, Y., Patel, J.M., Ramakrishnan, R. and Shahabi, C., 2014. Big data and its technical challenges. Communications of the ACM, 57(7), pp.86-94.
Kolko, J., 2015. Design thinking comes of age. Harvard Business Review, 93(9), pp.66-71.
Lorenzo-Romero, C., Constantinides, E. and Brünink, L.A., 2014. Co-creation: Customer integration in social media based product and service development. Procedia-Social and Behavioral Sciences, 148, pp.383-396.
Quwaider, M. and Jararweh, Y., 2015. Cloudlet-based efficient data collection in wireless body area networks. Simulation Modelling Practice and Theory, 50, pp.57-71.
Sahebjamnia, N., Torabi, S.A. and Mansouri, S.A., 2015. Integrated business continuity and disaster recovery planning: Towards organizational resilience. European Journal of Operational Research, 242(1), pp.261-273.
Seybold, P.B., 2014. Outside innovation. HarperCollins e-books.
Verhoef, P.C. and Lemon, K.N., 2013. Successful customer value management: Key lessons and emerging trends. European Management Journal, 31(1), pp.1-15.
Waller, M.A. and Fawcett, S.E., 2013. Data science, predictive analytics, and big data: a revolution that will transform supply chain design and management. Journal of Business Logistics, 34(2), pp.77-84.
Wu, X., Zhu, X., Wu, G.Q. and Ding, W., 2014. Data mining with big data. IEEE transactions on knowledge and data engineering, 26(1), pp.97-107.