Monthly Archives: August 2016


Transporting goods, as a subdivision of a company’s logistics department, has transformed from an ancillary service to one of the most valuable resources in modern commerce.

Though the value of transportation logistics varies across different supply chains and might be difficult to ascertain cumulatively, a William Blair & Company study found third-party logistics companies constitute a more than $500 billion global venture with only 10 percent of the logistics market share. As evidenced, the power of an intelligently managed fleet can yield exciting revenue growth and truly define a business at its core.

That said, operational deficiencies can also turn the best logistics providers on their proverbial ears. Supply chain networks subsist on the fast execution of pertinent tasks and abhor waste in all forms. Depending on where it occurs along the supply chain, even the smallest inconsistency can offset processes down the line and cause subsequent harm, costing several companies considerable revenue all at once.

Best practices always encourage logistics providers to implement verifiable methods of improving transportation management. In many ways, this is easier said than done. So what areas should transportation divisions look to that could both hide wasteful processes and, if maintained properly, significantly optimize operations?

Reliance on technology necessitates repair management
As more and more advanced equipment makes its way into all levels of the supply chain, manufacturers expand their potential for accelerated growth and enhanced productivity in granular increments that complement positive scalability. However, once any process becomes heavily dependent on technology, it can just as easily fall victim to its limitations. Machinery doesn’t last forever, and eventually businesses utilizing advanced equipment will need to dip into their repair and maintenance allocations. That is, after all, total cost of ownership.


Big trucks cost big bucks to maintain.

Unfortunately, these R&M costs are not fixed and will require foresight to avoid exorbitant fees. According to a study conducted by the American Transportation Research Institute, the average R&M costs per mile for a single truck grew by 7 percent between 2012 and 2013. When multiplied over an entire fleet and compounded by overall industry growth with expanded services, these operational costs truly stack up. Moreover, the ATRI study also stated fluctuating petroleum prices can impact the cost of tires, an integral and unavoidable resource for ground travel.

On top of all that, these costs don’t even consider the most important ones of all: downtime-related revenue losses. This multifaceted issue affects all aspects of business performance and can ruin profitability across all. Downtime loses companies customers, stifles on-site and off-site productivity, leads to expensive overtime pay for workers, hurts a brand’s image and could possibly even lead to expensive litigation. Though trying to assign a set number value to downtime casts a wide net, Gartner places the figure around $5,600 per minute – or more than $300,000 an hour – on average.

Companies that take their R&M schedule into their own hands can mitigate these risks. Logistics providers should develop strategies that log relevant data pertaining to things like vehicle wear and tear. This allows logistics managers to address necessary maintenance before the equipment breaks down and the serious costs start piling up. Moreover, this regimen creates an operational framework that accounts for repairs rather than merely responds to them as best it can.
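As a minimal sketch of what such a strategy might look like in code, the routine below flags vehicles whose logged mileage since their last service exceeds an assumed interval. The field names and the 25,000-mile threshold are illustrative assumptions, not an industry standard:

```python
# Hypothetical sketch: flag vehicles for preventive maintenance based on
# logged wear data, rather than waiting for a breakdown.

MAINTENANCE_INTERVAL_MILES = 25_000  # assumed service interval

fleet_log = [
    {"vehicle_id": "TRK-101", "miles_since_service": 27_450},
    {"vehicle_id": "TRK-102", "miles_since_service": 12_300},
    {"vehicle_id": "TRK-103", "miles_since_service": 25_010},
]

def vehicles_due_for_service(log, interval=MAINTENANCE_INTERVAL_MILES):
    """Return vehicle IDs whose logged mileage exceeds the service interval."""
    return [v["vehicle_id"] for v in log if v["miles_since_service"] >= interval]

print(vehicles_due_for_service(fleet_log))  # TRK-101 and TRK-103 are due
```

The same pattern extends naturally to other wear indicators – engine hours, brake pad thickness, tire tread depth – so long as the data is actually being logged.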

Closing the skills gap
To maintain a competitive edge, the transportation industry requires a competent workforce knowledgeable about on-site operations, as well as overarching trends in the field. However, many problems stand in opposition to this, ultimately stemming from negative public opinion about the transportation industry among those who are needed to assume its mantle in the coming years.

For example, a study by the Economic Modeling Specialists International found more than half of all truck drivers in Houston, Texas, are 45 years old or older. This is not an isolated issue. Given the size of the baby-boom generation, as these workers retire, others will need to fill the void. Not just anyone will do – as stated earlier, transportation has undergone a digital age makeover as of late. With Big Data and the Internet of Things redefining logistics operations, new hires must possess ever-increasing technological knowledge. Ironically, many young people in search of their first careers have been and will continue to be swayed by the incorrect notion that logistics and manufacturing are strictly about manual labor.

To ensure any enhancements to long-term productivity, transportation and logistics providers should never underestimate the value of public relations. Additionally, these organizations should develop new training tools that accelerate onboarding without glossing over the necessities, providing job candidates with resources that complement their tech-savvy lives and minimize downtime between incoming and outgoing employees.

Data transparency can help food and beverage manufacturers cut out waste and feed the world.

Food and beverage manufacturers are tasked with a single goal: to convert raw materials into edible meals and snacks. Though this might seem like a sterilized approach to an industry that farms out fresh vegetables or churns out chocolate chip ice cream, from the perspective of those in the field, this point of view helps focus attention onto certain components within a given process and isolate issues compromising efficiency. A certain number of ingredients are retrieved for processing and a certain number of groceries are the product of those ingredients. Pretty simple, right?

However, the wingspan of the food and beverage sector is far-reaching, and it is arguably the most complex modern industry. In 2007, Forbes estimated its value between $1.6 and $4.8 trillion globally, but published these figures with great reservation given how difficult it is to actually account for an industry as massive as the one that feeds nearly everybody on Earth.

This problem is indicative, at the level of the individual processor, of a significant issue in food processing: transparency. Just as no one can really tell how widespread the scope of the food and beverage industry is, many food processors individually do not have the resources necessary to track their operational efficiency or analyze their throughput in a manner that promotes positive, incremental growth and improvement, known in lean manufacturing circles as “kaizen.”

That said, any food processing schema can be divided into at least two major divisions: raw materials procurement and processing. Increasing data transparency in these target areas, along with expanding the capabilities of that data, can help manufacturers make more comprehensive decisions about how best to upgrade on-site efficiency.


Yield management lets food processors track their products from farm to plate.

Reducing raw materials waste with more comprehensive yield management
As we mentioned earlier, thinking about food processing linearly creates a simplified lens through which industry experts can perceive where they excel and where they fall short. To that end, greater attention paid to yield management could provide the food and beverage industry with a picture of just how efficient their workflow is at present, and in the long run, whether changes to operations push the needle further into the black or the red.

In brief, yield management concerns itself with how effectively a given manufacturer processes raw materials into a set number of products. Though it is nearly impossible to achieve a perfect balance between what a business consumes and what it produces – creating no waste in the process – allocating resources toward investments and practices that can determine and track a yield index can do more for companies than simply point out the unavoidable. One Wageningen University study found the yield index, once honed, can also work in reverse, providing manufacturers with a fuller idea of their processing capabilities based on the amount of raw materials they collect.
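In code, a yield index is simply the ratio of usable output to raw material input – and, as the study suggests, it can run in reverse to forecast output from materials on hand. A hypothetical sketch with assumed figures:

```python
# Illustrative sketch of a yield index: the ratio of usable output to raw
# material input. Quantities and units are hypothetical.

def yield_index(raw_input_kg, usable_output_kg):
    """Fraction of raw material converted into finished product."""
    if raw_input_kg <= 0:
        raise ValueError("raw input must be positive")
    return usable_output_kg / raw_input_kg

# Forward use: how efficient was this production run?
idx = yield_index(raw_input_kg=10_000, usable_output_kg=8_700)
print(f"Yield index: {idx:.2%}")  # 87.00%

# Reverse use: given a honed index, estimate output from raw materials on hand.
expected_output = idx * 12_500
print(f"Expected output from 12,500 kg of input: {expected_output:,.0f} kg")
```

Tracked over time, the same ratio shows whether process changes push the needle toward or away from waste.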

All this aside, taking advantage of better yield management deployments can only ever be the tip of the iceberg when improving operational efficiency for any food and beverage manufacturer. Though simplifying process industries can be a handy technique for starting to solve a given problem, the truth is this field is far too multifaceted to rely on only surface assessments. Many slight factors, known as yield and mix variances, can dramatically skew yield efficiency figures. For instance, Infor reported fluctuations in the fat content of chickens can ultimately reshape production volumes for the poultry industry and, in so doing, the ratio of production against raw materials. In the end, data gleaned from yield management strategies can only do so much for guiding efficiency measures, but it is undeniably fundamental to improving operations because it is a necessary metric through which to gauge process changes.

“Asset utilization plays a key role in preventing wasteful actions from impacting yield.”

Onboarding detection technology to perform proactive maintenance on processing assets
How efficiently a company operates also depends on how its supervisors, managers, and employees handle the on-site machinery that processes its raw materials into food. Asset utilization plays a key role in preventing wasteful actions from impacting yield. However, if these businesses cannot rely on their large-scale assets to function properly, how can they be expected to measure their efficiency with any accuracy?

Proactive maintenance solutions involve a higher degree of asset monitoring. By investigating small variations in things like the heat or speed of a component or subassembly, maintenance personnel and the companies they work for forgo long downtime and can instead schedule off-time to address minor tweaks and repairs. As Packaging World explained, broken equipment in any process industry can sacrifice significant uptime, but the importance of proactive maintenance is more about control through technology than anything else. In the world of food and beverage processing, a system error can also lead to enormous losses in raw materials should equipment fail mid-shift, jeopardizing production quotas. By outfitting capital-intensive assets with cutting-edge, low-cost monitoring equipment, and funneling the data through a centralized maintenance management system, food manufacturers can learn more about the machines they use every day, and have greater control over when and how they go offline.
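One way to picture that kind of monitoring: compare each new sensor reading against a recent baseline and flag anything that drifts beyond a few standard deviations. The sketch below uses assumed bearing-temperature readings and an assumed three-sigma limit – real deployments would tune both to the asset:

```python
# Minimal sketch of proactive condition monitoring: flag a component when a
# sensor reading drifts beyond a set number of standard deviations from its
# recent baseline. Readings and thresholds are assumed for illustration.
from statistics import mean, stdev

def drift_alert(readings, latest, sigma_limit=3.0):
    """Return True if the latest reading deviates from the baseline
    by more than sigma_limit standard deviations."""
    baseline, spread = mean(readings), stdev(readings)
    return abs(latest - baseline) > sigma_limit * spread

bearing_temps_c = [61.2, 60.8, 61.5, 61.0, 60.9, 61.3]  # normal operation
print(drift_alert(bearing_temps_c, latest=61.4))  # False: within band
print(drift_alert(bearing_temps_c, latest=68.0))  # True: schedule off-time service
```

Funneled through a centralized maintenance system, alerts like this let managers schedule repairs during planned off-time instead of reacting to a mid-shift failure.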

Six Sigma hates waste. If a manufacturing plant plans on getting any leaner, its executives, managers, and supervisors should be on the prowl for areas to streamline workflow and cut out unnecessary steps.

Depending on the industry in question, inventory may play a vital role in on-site operations. Whether inventory is used for holding spare parts for the company’s most important assets, managing SKUs awaiting completion or transportation, or storing raw materials integral to the manufacturing process, inventory management can be pivotal to efficiency.

Inventory can also, however, become a breeding ground for waste, and the stresses of an unkempt inventory can permeate throughout disparate operations, creating backlog, downtime, and unmet expectations. Let’s take a look at a few examples of how an inventory without proper curation can lead to all different kinds of waste.

Before we go any further
Though it may seem rather obvious, inventory management really only impacts industries that actually maintain a comprehensive inventory. With the rise of just-in-time supply services and kitting, many inventories across a number of different industries have been thoroughly rationalized to the point where harping on them any further would be, in a word, pointless. Sure, organization throughout a company is an ideal worth promoting, but further investing time, money, and energy into an inventory after taking certain measures will not provide an ample return.

According to a study on the link between operational performance and inventory management published in the Journal of Operations Management, only two-thirds of industries investigated experienced substantial gains from a leaner inventory. As the authors of the study suggest, companies must understand the role inventory plays in their individual business and measure how deeply ingrained it is in ancillary processes before committing resources. Otherwise, companies attempting to adopt a leaner mindset will expend resources unnecessarily – resources better spent on making actionable changes in other areas where they’re needed. Managing inventory waste for some will merely be a surface adjustment, but for others, it can clue them into subcutaneous operational deficiencies that can return serious value if unearthed.

Waste in waiting
In an interview with IndustryWeek, Frank Hill, director of manufacturing business development for Stratus Technologies, said 3 out of 10 manufacturers run into unplanned downtime, the number of large downtime events are on the rise, and each one costs roughly $17,000 to navigate. With so much at stake, it’s no wonder why manufacturers push to keep their assets continuously operational.

That said, downtime for many in the manufacturing sector only pertains to failed assets or big-time disruptions. That isn’t always the case. Downtime can also include any time spent not performing the value-added tasks at hand. Machinery can be fully operational, employees can be at the ready, but downtime can still occur. Rooting out the culprit might lead lean teams right to inventory.

Builds or repairs might require components held in inventory, which means employees tasked with these duties will need to locate them as fast as possible to complete their respective jobs. Without a regimented, intuitive system for hunting down these things, workers waste time between tasks. While it may only be a matter of minutes, these gaps in performance add up incrementally. To prevent these undercurrents of waste from siphoning successes, ask: What systems or procedures would a manufacturer have to enact to ensure its employees find what they need exactly when they need it?


Moreover, in the event of a large-scale downtime event, what sorts of low-tech, high-value activities can employees perform until they can return to their work? Manufacturers should develop fail-safe measures against wasted opportunities. In fact, so long as it integrates well with a company’s downtime objectives, inventory management – like cleaning, organizing, and shelving – can be something employees attend to during the wait to return value to the company during a downtime deficit.

Waste in damaged materials
Shrinkage is like a monster with many heads – a single swipe of a sword might not be enough to take the beast down. However, everything has a weakness, and for shrinkage, it may reside in catalysts entrenched in peripheral operations and processes.

Root cause analysis might confirm shrinkage directly correlates with administrative shortcomings or underutilized metrics during the procurement stage. For instance, if a manufacturer’s inventory is lousy with a specific type of fastener, it could be because orders for that component don’t legitimately reflect updated supply/demand data. As such, administration has been placing orders for parts employees don’t need and already have too many of. Additionally, supplier-manufacturer relations could also play a role in waste. If a particular component is rare or difficult to produce, suppliers may pigeonhole manufacturers into procuring more than they need. Without an effective system in place auditing and monitoring said inventory, manufacturers could be overspending upfront.

So how can manufacturers determine if their shrinkage is bad enough to devote serious resources toward? According to a study conducted by Supply Chain Visions and the Warehouse Education and Research Council, it’s all a matter of finding what percentage of an inventory is lost to shrinkage. The average inventory sheds between 0.046 percent and 0.07 percent of its contents to shrinkage, while a score of less than 0.005 percent would be optimal and industry-leading. If, however, a manufacturer’s shrinkage hovers above 1.46 percent, it stands to reap the most from reining in its inventory.
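The metric itself is straightforward to compute: the share of recorded inventory value missing at the physical count. A hypothetical example with assumed figures:

```python
# Sketch of the shrinkage metric discussed above: the share of recorded
# inventory value lost to damage, theft, or error. Figures are hypothetical.

def shrinkage_rate(recorded_value, counted_value):
    """Fraction of inventory lost between the books and the physical count."""
    return (recorded_value - counted_value) / recorded_value

rate = shrinkage_rate(recorded_value=2_000_000, counted_value=1_998_800)
print(f"Shrinkage: {rate:.3%}")  # 0.060%
# Against the benchmarks above: under 0.005% is industry-leading,
# 0.046-0.07% is average, and above 1.46% warrants serious attention.
```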

Waste in avoiding the real problem
As in the previous section, inventories overburdened with spare parts could signal manufacturers aren’t addressing major issues with their on-site assets. If inventories must maintain an above-average stock of replacement components for subassemblies constantly wearing out, supervisors have essentially made a habit out of treating the symptoms and not the disease.

Instead of worrying about additional spare parts, manufacturers should instead focus on the reasons why their valuable assets aren’t performing as they should. Tracking down the primary cause of even the most minor malfunctions will not only save on costs related to managing spare parts, but also completely eliminate the downtime required to replace the worn out component, returning considerable efficiency back to the manufacturer who takes the time to truly investigate.

What challenges do food and beverage companies face when integrating high-pressure processing equipment?

At first glance, high-pressure processing – or HPP – appears to be the answer to a question the food and beverage industry has long sought: What is the best way to process goods based on consumer interest?

HPP technology replicates extremely pressurized environments, thereby eliminating viral or bacterial growth as well as other such biota without introducing goods to high heat or infusing them with additives. This innovation marks a turning point in the industry at a time when an increasingly discerning consumer base has its eyes on how brands process their foods and drinks safely and sustainably.

However, integrating high-pressure processing technology into plant operations can’t be done overnight. Many internal factors must be accounted for before installing the equipment. Otherwise, businesses run the risk of hurting process efficiency, compromising quality, and creating a wasteful production system.

Process validation always difficult for new technologies
When engineers and food scientists develop breakthroughs in processing goods, validation teams must intervene and ensure these methods or machines act how they were designed to – not just once, but at every unique application. These days, many regulators have their sights on HPP, and for good reason.

According to Food Safety Magazine, HPP represents one of several non-thermal approaches to quality control, including cold plasma and high-intensity pulsed light technology, still heavily scrutinized by validating regulators. This is not because the process itself is flawed, but because it is so new that validation specialists still need to compile knowledge before easing up on validation measures across all applications.

To accomplish this, regulators require a lot of operational data to validate HPP processes, perhaps more so than would be necessary if a business simply installed familiar equipment with a more easily verifiable history. For businesses hoping to integrate HPP quickly, advanced data management strategies could provide regulators with the necessary information as completely as possible and at a faster clip. Before high-pressure processing integration, they must be sure the operational data management processes in place are capable of demonstrating the equipment’s potential so lag does not occur.

Possible changes to packaging materials may be necessary
One of the beauties of HPP technology is its ability to process goods after they’ve been packaged. This both enhances the quality of the foods and beverages right up to distribution and has the potential to accelerate cycle times along the way thanks to its instantaneous application of pressure.

That said, certain food and beverage packaging cannot withstand HPP treatment, according to a study by the National Food Lab. Metal and glass, for instance, would break under the intense conditions. Additionally, the National Food Lab specifies any flexible packaging like vacuum seals or plastics must “be compressed by about 15 percent without suffering structural damage and […] return to its original shape upon pressure release” to be considered safe in the long term.

Businesses should therefore research, in advance of any high-pressure processing investment, whether their current packaging strategy will hold up or need a complete overhaul. Knowing one way or the other could factor into asset purchases and future training courses on new HPP equipment, as well as any new operational requirements for packaging necessary to supplement changes to product processing.

Consider how to maintain product uniformity throughout HPP equipment life cycle
Over time, all equipment malfunctions, and HPP assets are no different. For instance, pistons creating the pressurized environment may do so unevenly when not calibrated properly, causing ruptured packages and destroyed goods when pressure runs too high or potentially passing contaminated products when it runs too low. Food and beverage businesses can look for these effects as signs their HPP machinery might be on the fritz, but by then it may be too late to continue production on time and at volume. Due to the high costs of HPP equipment and their importance to quality control, businesses may only rely on a few machines for their entire operation.

So, what should happen when HPP equipment is found to be technically deficient? Like all other capital-intensive assets on a production line, it needs to be serviced. However, reactive maintenance strategies will only exacerbate issues by bottlenecking production when machinery breaks down. Instead, businesses should establish predictive and proactive maintenance standards, as well as a system for ranking internal assets. That way, maintenance specialists can prioritize HPP repairs over less vital work orders.
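A minimal sketch of such a ranking system, with assumed asset names and criticality scores, might sort the work-order queue so repairs on the most critical assets – such as a plant’s few HPP machines – jump ahead of less vital jobs:

```python
# Hypothetical sketch of internal asset ranking for maintenance triage.
# Asset names, issues, and criticality scores are all assumed values.

ASSET_CRITICALITY = {"hpp_press": 10, "conveyor": 6, "labeler": 4}

work_orders = [
    {"asset": "labeler", "issue": "misaligned print head"},
    {"asset": "hpp_press", "issue": "piston calibration drift"},
    {"asset": "conveyor", "issue": "belt wear"},
]

# Sort the queue so the highest-criticality assets come first.
queue = sorted(work_orders,
               key=lambda wo: ASSET_CRITICALITY[wo["asset"]],
               reverse=True)
print([wo["asset"] for wo in queue])  # ['hpp_press', 'conveyor', 'labeler']
```

In practice, the scores would come from a formal criticality assessment rather than a hard-coded table, but the triage logic is the same.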

At a time when businesses everywhere value continuous improvement, setting goals is now more important than ever. Are your business objectives moving you forward, or are you merely treading water? We’ve put together a list of suggestions for how to set practical goals that actually grow your organization.

Target areas of great potential gain
The Pareto Principle strikes again – by following the time-tested 80/20 rule, business leaders can focus on areas where their companies experience the greatest inefficiency to deliver the most substantial results.

pareto principle

For a quick refresher, the Pareto Principle states about 80 percent of results come from 20 percent of the causes. For example, 20 percent of your customers purchase 80 percent of your wares. While this isn’t a hard-and-fast mathematical certitude, it is an effective foundation for thinking about internal processes. After all, if 80 percent of your cycle time comes from 20 percent of your processes, optimizing that one-fifth will significantly reduce production waste.
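A quick, hypothetical Pareto check on cycle times illustrates the idea: rank processes by the time they consume and watch how fast the cumulative share climbs. Process names and minutes below are assumed data:

```python
# Sketch of a Pareto analysis: which processes account for the bulk of
# total cycle time? All figures are hypothetical.

cycle_times = {  # minutes per unit, assumed data
    "changeover": 42, "inspection": 9, "assembly": 14,
    "packaging": 6, "curing": 80, "labeling": 3,
}

total = sum(cycle_times.values())
ranked = sorted(cycle_times.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
for name, minutes in ranked:
    cumulative += minutes
    print(f"{name:12s} {minutes:3d} min  cumulative {cumulative / total:.0%}")
# Here the top two of six processes account for roughly 80 percent of
# cycle time - the minority worth optimizing first.
```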

“Objectives with low effort and high value make excellent starting points.”

Also, when prioritizing which goals to tackle first, Inc. recommends considering the relationship between the effort expended to accomplish them versus the resulting payoff. Objectives with low effort and high value make excellent starting points for your improvement itinerary.

Always measure, always track
Success isn’t measured in feelings, but in absolute figures. Setting goals requires businesses to identify exactly how much of an improvement constitutes a win for the organization, as well as a system by which to quantify those wins. “Better changeover” won’t cut it as an objective, but “a 25 percent reduction in changeover time” will.
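Verifying a goal like that is a one-line calculation. The sketch below uses assumed baseline and current changeover times:

```python
# Sketch of checking a measurable goal: did a "25 percent reduction in
# changeover time" actually happen? Figures are assumed for illustration.

def reduction_achieved(baseline, current):
    """Fractional reduction relative to the baseline measurement."""
    return (baseline - current) / baseline

goal = 0.25  # the stated objective
actual = reduction_achieved(baseline=48.0, current=34.0)  # minutes
print(f"Reduction: {actual:.1%} - goal {'met' if actual >= goal else 'missed'}")
```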

Improvements to process, safety, workforce management, financial performance – all of these metrics can and should be monitored if businesses expect to make any headway on goal-setting. Decision-makers should also try to employ the latest operational monitoring technology like low-cost sensors coupled with process management software to get the most accurate results without adding waste elsewhere. Sure, you may have slightly improved your changeover time, but if you achieved it by forcing operators to fill out forms manually, you’re detracting from your gains in the process.

Go beyond the ordinary
These days, businesses avoid risks whenever possible. However, when it comes to setting goals, IndustryWeek contributor and manufacturing expert Larry Fast recommends setting the bar as high as possible:

“Which would you rather have,” Fast writes, “a 25 percent improvement goal that accomplishes 28 percent or a 75 percent improvement goal that ‘only’ yields 67 percent improvement but that jump starts the new way we’re going to manage the business?”

As Fast clarifies, when business leaders only set modest goals, they have a better chance of pulling them off. However, these represent only surface-level improvements accomplished through urgency, not through systemic process changes. When goals challenge organizations to reprogram the ways they’ve approached operations in the past, they undoubtedly invite risks, but they also open up incredible potential for outstanding and sustainable rewards.

You don’t need to whip out a textbook or ask your favorite search engine what process task analysis is – it’s in the name. You have a process made up of tasks and you wish to analyze them. Simple, right?

What might not be as clear-cut is the value something as straightforward as task analysis could provide any industry with complex processes, particularly to the employees assigned to carry them out. According to a study published in the Journal of Applied Sciences, human error is behind 9 out of every 10 errors in the workplace. In the wrong asset-intensive industry, an honest mistake could lead to machine failure, wasted resources, and perhaps even operator injury.

As such, companies in a continuous state of expanding operations should frequently set aside time to assess whether the intricacies of their processes undermine productivity gains. That’s where a process task analysis comes into play. When conducting a process task analysis on operations in your business, the following tips could help reveal more about your operations for greater understanding and adjustment.

Break through the surface to find the real problems with your processes.

Drill down complexity to achieve clarity
Before performing a process task analysis, you’ve no doubt visualized the operations in question using process mapping best practices. Once the process has been laid out before you, one question should surface immediately: Are your complex processes too complex?

Innovative businesses, or those that take a new approach to their corner of their industry, always tread where others haven’t. Trailblazing isn’t easy. However, untenable complexity may spring from companies trying to build out new processes on top of existing ones over time instead of examining whether a new component added onto an existing process calls the whole thing into question.

For instance, an organization with disparate data reporting applications for each of its departments may develop and attempt to hone systems for transferring important documentation interdepartmentally. A process task analysis may demonstrate the best course of action would be not merely to simplify the manual transfer processes, but to reimagine how the company in question shares information in general. Perhaps instead, the business should find a more expansive enterprise resource planning software that integrates well across all departments, cutting out manual transfers altogether.

“Simple processes can destroy one element of process optimization: uniformity.”

Don’t underestimate simple operations
All processes, even the little ones, are worthy of the attention of your organization. While business leaders might believe operations with short, linear process maps couldn’t possibly be trouble, they can be particularly destructive to one element of process optimization: uniformity.

If a defined process leaves a lot of room for participant interpretation, it increases the risk of mismanagement and inefficient performance. A process task analysis of these seemingly unassuming processes may reveal they are, in fact, missing the major steps needed to ensure a standardized system adhered to by all workers. Use a process task analysis to uncover ways to flesh out process maps with greater detail and information for more fully realized operations.

Collaborate with operators on best solutions
For a process task analysis to deliver optimal results, decision-makers can’t mistake surface-level problems for root causes. That’s why workers who perform processes under scrutiny ought to play a pronounced role in the analysis.

Do employees believe repetitive actions performed within a set process impair their productivity? It may be worthwhile to research how automated technologies could ease some of the pressure and reduce human error or fatigue. Additionally, operators may lack a depth of training required to perform their jobs well, rather than simply perform them. The manufacturing sector knows this all too well – according to a 2015 report by the Manufacturing Institute, a widening hiring gap for talented individuals over the next decade may leave hiring managers no choice but to take on less-than-qualified workers and train them on site.

Solutions like this can work and work well, so long as employees all receive the same quality of training. A process task analysis, therefore, could lead business leaders to enhance ancillary support materials, as opposed to more direct components of the processes they assess. You’re not getting off track by working on these resources – you’re building a foundation for better operations to come.