
Behind every great retailer, there is a warehouse receiving, stocking, sorting, and distributing its products. These operations provide an invaluable service to these enterprises – though if one had to put a number on it, Supply Chain Digest estimated the total cost of logistics in the U.S. at just under $1.5 trillion in 2014. Businesses spend significantly on their warehouses and logistics teams, trying to hone processes and accrue efficiency wherever possible, whether by building a better scheduling system so pickers can coordinate with transportation or by designing a more organized floor plan that limits excessive travel.

However, one lean processing enhancement retailers often balk at adopting is stock-keeping unit (SKU) rationalization, the practice of reducing the number of different products they sell to streamline business all along the supply chain. According to research from the Association for Convenience and Fuel Retailing – formerly the National Association of Convenience Stores – the average retailer stocks between 2,500 and 3,000 SKUs. Paul A. Myerson, professor of Supply Chain Management at Lehigh University and contributor for Inbound Logistics, wrote that grocery stores can easily hold up to 40,000 SKUs. Even removing a handful of these SKUs puts less of a logistical strain on warehouses and creates space for newer, more profitable products.

How can retailers get the most out of SKU rationalization initiatives?

As far-reaching as the benefits of removing a few underachieving products can be, some retailers may still find it difficult to decide which items should be removed from shelves for the sake of optimization. What sorts of tools could retailers adopt to make the most cost-effective cuts?

Distinguish between successes and failures using one time-tested method: the Pareto Principle.

Apply the Pareto Principle
Merchandise is only profitable to retailers if customers actually buy it and demand for it is evident. Otherwise, retailers waste money, time, and valuable shelf space stocking products no one wants. Worse still, they’ll need to pay for removal of these failing products as well. If retailers could collaborate with their warehouse logistics teams, be they in-house or third-party, to create a reliable metric by which to separate products on the grounds of success, these enterprises could intelligently discern which products actually move through the supply chain and which ones simply crowd the aisles of their stores.

In doing so, retailers should keep the Pareto Principle – also known as the “80/20 Rule” – in the back of their minds. As this economics cornerstone dictates, roughly 80 percent of effects come from only 20 percent of causes. In this case, 80 percent of sales come from only 20 percent of the items on warehouse shelves. Identifying and isolating the highest-performing fifth of SKUs within a given inventory can help inform retailers as they decide what needs cutting. For example, if a brand with five overall SKUs has four in that top fifth, retailers would probably benefit from holding onto all five. Alternatively, if another brand has only one of its five SKUs in the most profitable echelon, its lineup would make a good candidate for SKU rationalization.
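To make that screening step concrete, here is a minimal sketch of how a logistics team might flag the SKUs driving roughly 80 percent of revenue. The SKU names, sales figures, and 80 percent cutoff are illustrative assumptions, not data from any of the sources above.

```python
# Illustrative Pareto (80/20) screen over hypothetical SKU sales data.
sku_sales = {
    "BrandA-001": 120_000, "BrandA-002": 95_000, "BrandA-003": 88_000,
    "BrandA-004": 70_000,  "BrandA-005": 4_000,
    "BrandB-001": 52_000,  "BrandB-002": 3_500,  "BrandB-003": 2_100,
    "BrandB-004": 1_800,   "BrandB-005": 900,
}

total = sum(sku_sales.values())
ranked = sorted(sku_sales.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranking until roughly 80 percent of sales are accounted for.
cumulative = 0.0
top_sellers = []
for sku, sales in ranked:
    cumulative += sales
    top_sellers.append(sku)
    if cumulative / total >= 0.80:
        break

print(f"{len(top_sellers)} of {len(sku_sales)} SKUs drive 80% of sales: {top_sellers}")

# Everything outside that group becomes a candidate for closer review.
review_candidates = [sku for sku, _ in ranked if sku not in top_sellers]
print("Candidates for rationalization review:", review_candidates)
```

In practice, a retailer would run the same screen per category and weigh margin and customer loyalty, not just raw sales, before cutting anything.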

“Tracking end-user purchasing activity can simplify SKU rationalization.”

Count on customer-focused cutbacks
All this aside, the most trying reason retailers find SKU rationalization so difficult is the inherent element of customer retention. Should a retailer discontinue a product popular with a small but loyal customer base, it risks losing that business permanently. So, when developing a plan for implementing SKU rationalization, forgetting to consider what the customer thinks would be foolish for a number of reasons. Most importantly, tracking end-user purchasing activity can simplify the process even further.

For instance, a Marketelligent study found the success rate of SKUs segmented by color and style stayed relatively constant year after year, and designs reactivated after being discontinued for underperformance experienced no great gains or losses either way. In fact, on a list of SKUs ranked by popularity from highest to lowest, the “long tail” grows more pronounced the farther one descends, meaning mid- to low-selling items suffer most from a glut of options. By studying customer buying habits and their cumulative effect on sales, retailers and their warehouse teams can plot a course for effective SKU rationalization that reduces costs, complexity, and waste.

In the coming decades, the field of medical device manufacturing will be put to the test. According to the U.S. Census Bureau, more than 75 million baby boomers are reaching retirement age in the near future. Now more than ever, healthcare in general will need to find a functional way to accommodate so many late-in-life patients. And seeing as the number of millennials exceeds 83 million, a long-term solution is vital to the industry’s success. A system must be put in place that both adequately provides services to baby boomers today and is sustainable enough to at least lay the foundation for healthcare’s future. After all, the U.S. population is only getting bigger.

Medical device manufacturing will play a large, active role in how U.S. healthcare reinvents itself in the modern age, especially with the rise of the Internet of Things, a growing interoperable network of wirelessly connected devices. A Goldman Sachs report on medical technology found that devices like implantables and other forms of monitoring equipment have the power to lower healthcare costs for both providers and patients. Since cost stands as the single greatest concern for both parties, innovations in medical device manufacturing can be considered one of the most crucial drivers of this healthcare revolution.

That said, if medical device manufacturers cannot develop new equipment quickly and efficiently, the medical industry as a whole loses a level of responsiveness it will desperately require during this transitional period. How can this industry increase its efficiency, optimizing its research and development stages to help build an advanced infrastructure of digital awareness that makes low-cost, high-tech healthcare a reality?

“Manufacturers allocate more money to R&D, but generate fewer products.”

Never compromise on quality – reinforce it with process mapping
A study published in Nature Reviews Drug Discovery stated that over the last six decades, the number of new treatments created for every $1 billion spent on research and development has declined by around 50 percent every nine years. In short, drug and medical device manufacturers allocate more money to R&D but generate fewer products. Though this trend may indicate a need for efficiency improvements during a given product’s clinical trial stages, for the time being, manufacturers have no choice but to find areas in R&D to cut for the sake of cost efficiency.

However, regulation from the Food and Drug Administration has only become more stringent in the last few decades, with little sign of loosening. Reducing costs, therefore, should not jeopardize the quality of the products under review.

Unconnected though it may seem, envisioning better layouts for a medical device manufacturer’s work cells on the production floor can increase efficiency by improving throughput and reducing resource costs, while potentially also improving a device’s quality. According to CIRTEC Medical Systems, O- and U-shaped production lines, as well as intelligently placed tools and materials at every workstation, increase efficiency by reducing wasteful movement and action. Instead of hiring more employees or scaling up costly assets, manufacturers could strengthen R&D and standard operations through process mapping to gain a substantial efficiency margin.

Efficiency gains in one area often bolster success in another. Streamlining R&D means manufacturers are able to move into the production stages faster, and reducing waste in manufacturing creates a greater pool of resources to encourage innovation in R&D.

Medical device manufacturers can integrate new technology into their operations.

In-house tech takes center stage in R&D, innovation timeline
As we just touched on, rudimentary organizational upgrades can impact operational efficiency in extraordinary ways. Outside of the production floor, however, there are plenty of other opportunities to incorporate more consolidated, actionable lines of interdepartmental communication to that very end. But before the medical device manufacturing industry can accomplish those goals, companies within it may need to overcome certain stigmas around onboarding new internally facing technology.

A poll conducted by PricewaterhouseCoopers found that executives in medical device manufacturing expect increases in innovation, but around 86 percent have yet to focus their directives on enhancing the processes directly related to fast-tracking new products from R&D through their release to market. Moreover, of the executives surveyed, only 12 percent use advancements like mobile technology, data analytics, and cloud technology to “create new business models that center on clinical and consumer dynamics.”

While these technological additions to the world of medical device manufacturing cannot by themselves guarantee efficiency improvements, their absence in an industry dedicated to advancing healthcare through state-of-the-art computerized machinery doesn’t make much sense. As the nature of patient care turns a corner, so too should the organizations leading the charge, which means laying the groundwork for more intelligent, responsive health services through the integration and development of dynamic digital assets that allow employees to communicate data comprehensively.

Forestry has been one of the most profitable industries in recent years, even with modifications made to integrate environmentally sustainable practices into the mix. The Southeastern Lumber Manufacturers Association stated U.S. forestry products, from wood to paper, generate as much as $200 billion every year. Moreover, replanting strategies have made U.S. forests today 25 percent denser than in the 20th century.

As lucrative as some business opportunities within this industry can be, the potential for inefficiency to detract from that prosperity is also quite high. U.S. lumber manufacturing in particular is extremely competitive. According to an estimate by the U.S. Lumber Coalition, $0.60 to $0.70 out of every dollar spent on timber goes toward reimbursing manufacturers for the cost of production, not toward profit. As such, competition in lumber manufacturing can sometimes work against all parties involved. However, finding ways to reduce materials costs or inefficiencies in internal processes may give lumber companies a cushion with which to benefit from marginal decreases in final product pricing.

One such area ripe for optimization is asset utilization. How manufacturers use their tools and equipment ultimately decides the outcome of their enterprise. By scrutinizing the ways operators complete a given task and whether their actions align with the industry’s best practices, lumber yard managers can potentially eliminate waste, save resources, cut lead times, and increase the quality of the lumber they produce.

Optimizing asset utilization in mill machinery means cutting lumber smarter to retain more of it.

Comprehensive automation throughout the entire lumber mill
From a general standpoint, automating as much millwork as possible increases the likelihood of greater asset utilization. Research from North Carolina State University found that while reducing manual processes at the onset – like gang ripping – could boost yield and curtail operational errors that result in wasted wood, many procedures on-site still rely too heavily on employees who could be better utilized elsewhere.

For instance, if rough mills could integrate technology capable of assessing the least-cost grade-mix – which directly correlates to the accuracy of a yield prediction – and then communicate that data to industrial sawing equipment without operator intervention, lumber manufacturers could erase a time- and labor-intensive step from the milling process. When lumber yard employees are expected to eyeball raw materials to determine their grade, there is too much room for mistakes. Faulty judgments produce waste and increase labor costs to make up for lost time. Automating as many aspects of lumber mill operations as possible increases asset utilization and leaves nothing to chance.
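For a rough sense of what a least-cost grade-mix calculation involves, the sketch below compares the effective cost of usable fiber across a few hardwood grades. The per-grade costs and predicted yields are hypothetical placeholders, and a real rough-mill system would evaluate mixes of grades against detailed cutting bills rather than ranking single grades.

```python
# Simplified least-cost grade comparison: which grade delivers the
# cheapest usable board foot once predicted yield is factored in?
# Costs ($/board foot) and yields (usable fraction) are made-up examples.
GRADES = {
    "FAS":       {"cost_per_bf": 1.80, "predicted_yield": 0.68},
    "1 Common":  {"cost_per_bf": 1.10, "predicted_yield": 0.55},
    "2A Common": {"cost_per_bf": 0.80, "predicted_yield": 0.42},
}

def cost_per_usable_bf(grade: dict) -> float:
    """Raw cost divided by the fraction of fiber expected to survive processing."""
    return grade["cost_per_bf"] / grade["predicted_yield"]

ranking = sorted(GRADES.items(), key=lambda kv: cost_per_usable_bf(kv[1]))

for name, grade in ranking:
    print(f"{name:>10}: ${cost_per_usable_bf(grade):.2f} per usable board foot")

# In an automated mill, this result would feed saw scheduling directly,
# with no operator eyeballing the grade.
print(f"Lowest-cost source of usable fiber: {ranking[0][0]}")
```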

Rethinking how debarking assets affect resources
It stands to reason that the more fiber retained after raw material has been debarked, the greater the possibility of creating ideal lumber products. Manufacturers of forestry goods that own conventional drum or closed cylinder debarking equipment, however, might find a number of efficiency upgrades by switching to cradle debarkers, according to the U.S. Department of Energy.

To start, open-top debarking systems give operators the freedom to debark different types and grades of wood while accounting for bark thickness, whereas closed systems shear back bark without taking these variations into consideration. In the end, that added control cuts back on how much tree fiber is lost during the debarking process, retaining more resources for the mill to turn into lumber.

“Cradle debarkers use 33% less energy than closed models.”

Additionally, as the DOE states, cradle debarkers use 33 percent less energy than closed models. They are also compact enough to integrate into existing lumber mill operations rather than requiring a separate debarking facility elsewhere, effectively eliminating costly transportation between the two sites. In making one simple equipment change, sawmills could permanently optimize the milling process without reshaping much else.

Limiting lumber variation through modern data-driven initiatives
Determining the dimensions by which fiber is cut into lumber ultimately impacts both yield and how manufacturers operate saws and other equipment. That said, companies in the industry must remember to account for differential shrinkage during lumber drying, which can lead to resource hemorrhaging and cost anywhere from $50,000 to as much as $250,000 annually depending on the mill in question, according to a study conducted by members of the Forest Products Society.

Mills can get ahead of this problem by abandoning traditional formulas for ascertaining optimal board thickness in favor of real-time statistical process control. Up-to-the-second data increases a lumber mill’s recovery rate for the fiber it cuts into lumber, as it limits the time between resource assessment and the sawing process. In doing so, mills can increase asset utilization by paring down the waste created when inaccurate drying predictions result in defective lumber products.
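As a minimal sketch of how live process data might set a sawing target, the example below derives a green target thickness from a desired finished dimension, an assumed shrinkage percentage, a planing allowance, and the standard deviation of recent in-line thickness readings. Every figure, including the z-factor, is an illustrative assumption rather than a mill-specific value.

```python
import statistics

def green_target_thickness(final_in: float,
                           shrinkage_pct: float,
                           planing_allowance_in: float,
                           recent_readings_in: list[float],
                           z: float = 1.65) -> float:
    """Illustrative target size: finished dimension plus planing allowance,
    scaled up for expected drying shrinkage, plus a sawing-variation buffer
    sized from the live standard deviation of recent measurements."""
    sawing_std = statistics.stdev(recent_readings_in)
    pre_shrink = (final_in + planing_allowance_in) / (1 - shrinkage_pct / 100)
    return pre_shrink + z * sawing_std

# Hypothetical run: 1.00" finished board, 7% drying shrinkage,
# 0.08" planing allowance, and the latest in-line thickness readings.
readings = [1.17, 1.19, 1.16, 1.21, 1.18, 1.20, 1.17]
print(f"Suggested green target: {green_target_thickness(1.00, 7.0, 0.08, readings):.3f} in")
```

The point of the feedback loop is that as measured sawing variation tightens, the buffer shrinks and less fiber has to be committed to every board.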

As the saying goes, “It is a poor craftsman who blames his tools.” The weight of this argument still rings true, but intelligently optimizing asset utilization supports mill equipment operators so they get the most out of the tools they use daily and, in turn, process lumber efficiently without generating unnecessary waste.

Pharmaceutical manufacturers can magnify operational efficiency by taking a closer look at on-site quality control processes.

Pharmaceutical manufacturing is all about innovation. The businesses within the industry work diligently to produce effective and accessible medications, as well as medical equipment for patients struggling with illness or discomfort. In this way, pharmaceuticals stand apart from other fields of manufacturing: pharmaceutical products can be ingested by customers or, as is the case with hypodermic needles or dialysis machines, connect with the body invasively. As such, raw materials and finished products must be meticulously tested for quality to ensure patients receive healthy treatment doses and drug manufacturers don’t accidentally introduce microbial pathogens into a patient’s already compromised immune system.

By finding new ways to increase operational efficiency in quality control testing labs, pharmaceutical manufacturers can enjoy an advantage other industries do not – namely, the enhanced ability to save lives. However, these changes must be integrated carefully. Quality control professionals cannot make concessions on rigorously regulated testing assays. That said, finding methods for augmenting how the quality control process is performed – as opposed to altering the process itself – can yield powerful results in how pharmaceutical manufacturers retain valuable resources and expedite lead times, while still adhering to current good manufacturing practices (cGMP).

“Top performers in pharma had higher success rates with their medications.”

Understanding the value of communication
Before discussing a couple of efficiency measures the pharmaceutical industry could adopt to optimize performance in its clean rooms and testing labs, it is important to first touch on the true value of doing so. Drug companies take an incredible amount of time researching, testing, and producing their wares. A 2014 industry report from the International Federation of Pharmaceutical Manufacturers and Associations stated the time span between the start and end of a drug or vaccine’s research and development stages could be as much as 10 to 15 years per product. Even though these businesses want to develop as many treatment options as possible to generate a profit and help a widespread audience, they must adhere to all quality control standards, which can legitimately impact the rate of medical innovation.

Pharmaceutical manufacturers with similar time-efficiency issues could benefit from heightened attention to how they communicate interdepartmentally – or rather, how often these departments communicate. Customer demand informs manufacturers on how much they should produce and what they should be producing to meet the needs of consumers. As a business within the pharmaceutical industry accelerates its production cycles or scales its market share, data inherently becomes more granular and, indirectly, more volatile if left unchecked. As drug companies toy with the idea of addressing deficiencies in their quality control labs, they should first structure how these departments communicate, especially the frequency with which QC professionals sit down for meetings with other department heads. Weekly conversations, even informal ones, can keep everyone – including the laboratory – abreast of major production crunches or fluctuations in volume that could impact testing. For instance, if an executive announces the future construction of a new clean room in which manufacturing workers can develop treatments, the quality control lab ultimately gains an extra area to monitor regularly.

How can manual testing methods receive a performance boost with help from rapid microbiology?

Reducing costs through rapid microbiological methods
Cost reduction in pharmaceutical manufacturing does more than save businesses money – it can have far-reaching effects on U.S. healthcare. A study published in the Journal of Therapeutic Innovation and Regulatory Science revealed that if drug companies decreased manufacturing costs by 30 percent, they could yield anywhere between $1 trillion and $12.3 trillion in social benefits to patients and medical science alike.

“Rapid microbiological equipment automates low-value laboratory labor.”

Quality control testing is both a cost- and resource-intensive operation, particularly so long as manual, growth-based assays remain the norm. However, the burgeoning field of rapid microbiology has been changing the way quality control professionals function in a laboratory setting. As the name suggests, rapid microbiology seeks to increase the speed and efficacy with which quality control lab technicians perform their daily duties without altering time-tested compendial culture counting techniques. Additionally, rapid microbiological equipment automates certain low-value laboratory labor to free up technicians and centralize these processes into a single location.

To put rapid microbiological methods (RMM) into context, let’s focus on one specific laboratory activity: moving samples between incubators. Traditionally, lab workers would need to manually move the plates one by one at a predetermined time. Automated equipment could be programmed to execute this task with computer precision without much more than informal oversight.

According to a case study performed by Microbiology Consultants, LLC, a company that spent more than $5.2 million on conventional testing saw a nearly 90 percent reduction in operating costs the first year it switched to RMM. It is worth noting that a significant portion of this reduction came from eliminating consumables and disposal costs entirely from the quality control process, thanks to innovative new approaches to sample preparation. Beyond the potential for return on investment (ROI), rapid microbial testing technology and strategies could guard against wasting valuable human capital, protect against contamination, and ultimately shorten time to release while reinforcing quality control-related cGMP.

Integrating cost and resource efficiency into a pharmaceutical provider’s quality control model will require a careful balance between eliminating waste and upholding best practices for the sake of its patients. Businesses that develop and integrate strong strategies for walking that line will benefit greatly.