Mythbusters: ECC Nursing Myths and Facts
European Veterinary Emergency and Critical Care Congress 2019
Kenichiro Yagi, MS, RVT, VTS (ECC, SAIM)
College of Veterinary Medicine, Cornell University, Ithaca, NY, USA

Is heparinized saline necessary to keep catheters patent?

Proper maintenance of peripheral intravenous (IV) catheters is critical in the care of critically ill patients, as these catheters are used for fluid delivery, drug administration, blood product transfusion, and parenteral nutrition. The best method of maintaining the catheter is of interest, encompassing catheter patency, maintenance protocols, and dressing methods. Occlusion of IV catheters is a common complication that necessitates replacement of the catheter and leads to additional patient discomfort and medical cost. While catheter material and patient-related factors can contribute to clot formation, one of the key elements of maintaining patency has been flushing the catheter with heparinized saline. The use of heparinized saline carries its own concerns, including coagulopathy, drug incompatibilities, allergic reactions, and heparin-induced thrombocytopenia and thrombosis syndrome.

The first veterinary study conducted to determine whether there is any difference in effectiveness between heparinized saline and normal saline compared 10 IU/mL heparinized saline with 0.9% sodium chloride. An 18-ga, 1.25-inch catheter was placed in each test subject, and subjects were separated into three groups. The first group had their catheters flushed with heparinized saline every 6 hours over a 42-hour period. The second group had their catheters flushed with normal saline every 6 hours over the same period. The third group served as a control used to determine how long a catheter took to clot if it was not flushed. Aspiration of blood from the catheter was attempted prior to each flush, and the site was evaluated for any signs of phlebitis.
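As a practical aside, flush solutions such as the 10 IU/mL concentration used in this study are prepared by diluting stock heparin into saline. The sketch below (in Python) works through that dilution arithmetic; the 1,000 IU/mL stock concentration is an assumption for illustration, not a detail from the study, so verify the product label.

    # Minimal sketch (not the study's protocol): volume of stock heparin
    # needed for a heparinized saline flush. Assumes a 1,000 IU/mL stock;
    # heparin stock concentrations vary by product, so check the label.

    def stock_volume_ml(target_iu_per_ml, final_volume_ml, stock_iu_per_ml=1000.0):
        """Solve C1*V1 = C2*V2 for V1, the volume of stock to add."""
        return target_iu_per_ml * final_volume_ml / stock_iu_per_ml

    # 10 IU/mL (the study's concentration) in a 100 mL final volume:
    vol = stock_volume_ml(10, 100)  # -> 1.0 mL of stock
    print(f"Add {vol:.1f} mL stock heparin to {100 - vol:.0f} mL 0.9% NaCl")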

The study observed that all catheters in both treatment groups could be flushed without any resistance or occlusion. The number of catheters from which blood could be aspirated was higher in the heparinized saline group, but the difference was not statistically significant (9 of 12 vs. 5 of 12 at 42 hours, p=0.065). No signs of phlebitis were seen in any group. The authors concluded that heparinized saline flushes yielded no benefit over 0.9% sodium chloride in maintaining peripheral 18-ga catheters over a 42-hour period.

This veterinary study echoes human studies observing that intermittent flushing of IV catheters with normal saline is as effective as flushing with heparinized saline. It also suggests that 18-ga IV catheters might not require any flushing at all for the first 24 hours. There are some studies, however, that found heparinized saline superior, muddying the waters on the issue. Some limitations should also be considered, such as the study duration of 42 hours. Many catheters in critical care settings are used longer than 42 hours, making information beyond that point desirable before heparinized saline is completely replaced with normal saline for flushing. The patency of the catheter was determined through a qualitative evaluation of resistance by the investigators, so no objective measurements of clot formation were available. The study also evaluated a single size of catheter, and the data's applicability to other sizes and lengths is uncertain. For example, studies of heparinized saline use in central venous catheters currently provide even less definitive conclusions because of the variability in maintenance protocols with regard to heparin concentration and flushing frequency. Other factors in the study that can differ from clinical situations include the catheter diameter as well as the disease state of the patient (hypercoagulable patients could be more prone to catheter occlusion). A future study of longer duration measuring these effects objectively is desirable to shed more light on the topic.

Should IV catheters be replaced routinely?

Hospital protocols often recommend replacement of a patient's IV catheters every 72–96 hours, as this is thought to reduce the risk of phlebitis and bloodstream infections. The US Centers for Disease Control and Prevention guideline recommends replacement no more frequently than every 72–96 hours. Routine replacement of IV catheters exposes the patient to additional stress, discomfort, venipuncture, and restraint. It also adds a financial burden for owners, or at the very least increased staff time and supply demand for the hospital. More recently, many practices have instituted protocols calling for catheter replacement only when clinically indicated, attempting to alleviate the morbidity and costs associated with routine replacement. An assessment of the effects of the two approaches would be beneficial in setting hospital protocols.

Numerous studies related to this topic have been conducted in human subjects and are summarized in a systematic review through the Cochrane Collaboration. The review found no significant difference in the occurrence of catheter-related bloodstream infection (CRBSI) between clinically indicated and routine replacement, with 1 of 2,365 and 2 of 2,441 patients affected, respectively (p=0.64). There was no difference in phlebitis, seen in 186 of 2,365 cases with clinically indicated replacement and 166 of 2,441 cases with routine replacement (p=0.75). A significant reduction in catheter placement costs of approximately AUD 7.00 was seen in the clinically indicated group. The reviewers concluded that there is no clinically significant difference between clinically indicated and routine replacement of peripheral IV catheters.

Because no difference is seen between the two methods, a recommendation can be made to adopt a protocol of replacing peripheral IV catheters only when clinically indicated. Patients will avoid being subjected to unnecessary pain, and clients and the practice will not incur an unnecessary drain on resources. There is currently no veterinary evidence available to provide insight into our practice. There could be differences between species or practice settings, such as the higher tendency of veterinary patients to soil or tamper with the catheter insertion site, and any unexpected differences in physiology. With that said, many practices have instituted a clinically indicated replacement approach without subjective increases in complications. If the clinically indicated replacement approach is taken, structured protocols of routine inspection of the catheter site at least every 24 hours for signs of inflammation, infiltration, occlusion, or infection should be followed.
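To make "clinically indicated" actionable, a written checklist helps. The snippet below is a hypothetical sketch of how the four inspection findings named above could be recorded and turned into a replacement decision; it is an illustration, not a validated clinical scoring system.

    # Hypothetical sketch of a daily catheter-site check; the four flags
    # mirror the inspection criteria above and are not a validated score.
    from dataclasses import dataclass

    @dataclass
    class CatheterSiteCheck:
        inflammation: bool = False  # redness, swelling, pain at the site
        infiltration: bool = False  # fluid leaking into surrounding tissue
        occlusion: bool = False     # resistance to flushing
        infection: bool = False     # discharge or other signs of infection

        def replacement_indicated(self):
            """Replace the catheter only when a clinical trigger is present."""
            return any([self.inflammation, self.infiltration,
                        self.occlusion, self.infection])

    print(CatheterSiteCheck(occlusion=True).replacement_indicated())  # True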

Do antimicrobial impregnated central venous catheter supplies prevent CRBSI?

Central venous catheters (CVCs) are often used in critical care for a variety of reasons, including blood sampling, central venous pressure measurement, infusion of high-osmolarity fluids, simultaneous infusion of incompatible drugs through multiple lumens, and parenteral nutrition. A major concern with placement and maintenance of CVCs is the possibility of CRBSI adding to patient morbidity and mortality. A variety of strategies have been adopted to prevent this common complication, including catheter maintenance bundles, antimicrobial treatment of the catheter, and antimicrobial treatment of the catheter insertion site or dressings.

A Cochrane review of antimicrobial-treated (AMT) CVCs (antiseptic or antimicrobial impregnation, coating, or bonding) compared AMT with non-AMT CVCs, compared the effect of antimicrobial impregnation with other antimicrobial modifications (antiseptic dressings, hubs, tunneling, needleless connectors, etc.), and examined differences between identifiable subgroups such as length of catheter use and practice setting. The review covered 16,784 catheters and 11 impregnation types. It found that catheter impregnation significantly reduced CRBSI; however, it did not reduce the incidence of sepsis, mortality, or catheter-related local infections. Significant benefits were seen in ICU settings compared with haematological and oncological units or with CVC use for parenteral nutrition. AMT did not affect the incidence of other adverse events such as thrombosis, thrombophlebitis, bleeding, erythema, or tenderness at the insertion site.

A review of antimicrobial dressings in CVC placement in human infants found that chlorhexidine dressing with alcohol skin cleansing reduced catheter colonization in a manner similar to polyurethane dressing with povidone-iodine cleansing, and was no different in its effect on sepsis and CRBSI. Chlorhexidine dressing seemed to cause a higher incidence of contact dermatitis, however. Silver-alginate patches did not cause adverse effects, but their efficacy is unclear. A separate review of dressing and securement devices for CVCs evaluated various devices and their effect on CRBSI, catheter colonization, site infection, skin colonization, skin irritation, failed securement, dressing condition, and mortality. It found that chlorhexidine gluconate-impregnated dressings reduced the incidence of CRBSI and catheter tip colonization compared with standard polyurethane dressings, and that medication-impregnated dressings reduced the CRBSI rate compared with non-impregnated dressings. Of all options evaluated, sutureless securement devices were the most effective at reducing CRBSI, with chlorhexidine gluconate-impregnated dressings second.

While the evidence evaluated by these reviews is from human subjects, some messages can be extracted for potential benefit in the veterinary field. The use of AMT CVCs might not be as effective as theorized, as the incidence of sepsis and mortality was not significantly different. The use of antimicrobial-impregnated dressings should be encouraged, and implementation of sutureless securement devices explored. Antimicrobial-impregnated dressings are a relatively inexpensive intervention and should be considered if current protocols include the use of standard polyurethane dressings or gauze.

Should patients with gastroenteritis be fasted?

Patients exhibiting gastroenteritis with signs of vomiting and diarrhoea are often placed on a nil per os (NPO) nutritional plan, as this is considered beneficial for the patient. The reasoning behind this practice is varied. One rationale is resting the bowel by minimizing stimulation of contractions and reducing faecal volume and frequency of defaecation. Another is to reduce the chance of vomiting triggered by distension of the stomach. By fasting, the vomitus is thought to contain fewer nutrients that could increase the chances of bacterial proliferation and pneumonia if aspirated. The presence of undigested food in the gastrointestinal tract is also thought to have detrimental effects, such as promoting bacterial proliferation and secondary infections, or inducing osmotic effusion into the gastrointestinal lumen and exacerbating diarrhoea. Offering food while a patient is nauseated can also lead to food aversion, delaying the return of appetite once the patient feels less ill. For these reasons, a traditional approach to gastroenteritis is to withhold food for 24–72 hours before offering food.

However, numerous veterinary studies support instituting enteral nutrition early in hospitalization. A study of patients with haemorrhagic gastroenteritis in which a hydrolyzed protein diet was introduced early in hospitalization observed that feeding did indeed increase the frequency of vomiting, but only initially: these patients saw a reduction in the frequency of vomiting and regained tolerance to feeding within 2 to 3 days. It is thought that the introduced food serves as a prokinetic and thus reduces the amount of vomiting compared to a fasted state. Another study split patients with parvoviral enteritis into a group that was fasted and a group that was given enteral feeding. The investigators observed that patients that were fed stopped vomiting significantly sooner than patients that were fasted, leading to the conclusion that early enteral nutrition is beneficial for cessation of vomiting. However, food high in fat, soluble fibre, or poorly digestible starch can promote emesis instead, and gastric distension also contributes to stimulation of vomiting. With these points in mind, feeding small, low-fat meals frequently is recommended.
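In practice, "small meals frequently" is usually planned against resting energy requirement (RER) using the widely cited formula RER = 70 x BW(kg)^0.75 kcal/day. The sketch below works through that arithmetic; the starting fraction of RER and the meal count are illustrative assumptions, not protocols from the cited studies.

    # Feeding-plan arithmetic using resting energy requirement
    # (RER = 70 * BW_kg**0.75 kcal/day). The starting fraction (1/3 of
    # RER) and meal count (6/day) are illustrative assumptions, not
    # values taken from the studies cited above.

    def rer_kcal_per_day(body_weight_kg):
        return 70 * body_weight_kg ** 0.75

    def kcal_per_meal(body_weight_kg, fraction_of_rer=1/3, meals_per_day=6):
        return rer_kcal_per_day(body_weight_kg) * fraction_of_rer / meals_per_day

    # A 20 kg dog started at one-third of RER split into six small meals:
    print(f"{kcal_per_meal(20):.0f} kcal per meal")  # ~37 kcal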

Providing nutrition early will also prevent patients from experiencing the vigorous peristaltic action described by people as "hunger pains," as the presence of food promotes normal peristaltic action. The presence of volatile fatty acids such as propionic acid and butyric acid provides an acidic environment in the gastrointestinal lumen, suppressing the proliferation of pH-sensitive pathogens such as Campylobacter and Clostridium spp. and likely having some beneficial effect in preventing secondary bacterial infections. In terms of the structure of the gastrointestinal mucosa, fasted animals experience a reduction in villous height and crypt depth, decreased antioxidant content in mucosal tissues, and increased induction of enterocyte apoptosis. A gastrointestinal mucosa provided with food will instead experience healthier mucosal turnover and strengthening of the mucosal barrier. The gastrointestinal mucosa seems to rely on luminal nutrients to passively obtain glutamine, amino acids, essential fatty acids, folate, zinc, vitamin A, and vitamin B12, which are all necessary for healthy mucosal turnover. Each of these factors serves to reduce the chance of bacterial translocation in patients provided nutrition. The presence of luminal nutrients also reduces the expression of adhesion molecules and subsequent neutrophil sequestration and activation, and keeps the function of T and B lymphocytes in producing IgA and cytokines intact, providing benefits to immunologic function.

These reasons support providing enteral nutrition as soon as fluid deficits are replenished. Many negative effects of feeding can be alleviated through providing smaller amounts of a highly digestible diet that is low in fat. Other evidence supports the importance of earlier nutritional intervention in many critical illnesses.

Nasogastric tube or nasooesophageal tube?

Nasoenteral tubes are used in hospitalized patients to provide enteral nutrition with liquid diets on a short-term basis, especially when anaesthesia is undesirable. Nasoenteral tubes can be inserted to terminate either in the oesophagus or in the stomach, called nasooesophageal (NE) and nasogastric (NG) tubes, respectively. Both types of tube share complications such as epistaxis, dacryocystitis, rhinitis, aspiration pneumonia, tube occlusion, diarrhoea, vomiting or regurgitation, and unintended removal.

The selection of NE versus NG tube placement is a choice presented to the veterinary team. NG tubes had been avoided by some because of the potential for increased risk of regurgitation, gastrooesophageal reflux, and resultant oesophagitis or stricture, as a tube placed across the lower oesophageal sphincter prevents full closure. NE tubes circumvent these risks, though the potential for unintended displacement of the tube might be increased, and NE tubes also lack the ability to decompress the stomach or measure gastric content that NG tubes provide. The type of nasoenteral tube chosen has been largely up to clinician preference.

A retrospective veterinary study evaluated the incidence of complications between the two methods to determine any advantage of one over the other. The study evaluated the occurrence of complications including epistaxis, vomiting, regurgitation, diarrhoea, clogged tubes, tube malpositioning, aspiration pneumonia, hyperglycaemia, and refeeding syndrome. It also evaluated differences in feeding method (bolus vs. constant rate infusion), amount fed (% of resting energy requirement), and administration of medications by tube. The study observed no significant difference in complication rate between NE and NG tubes, nor any effect of the other factors (feeding method, amount fed, and medications).

The lack of a difference seen in the study suggests that there is likely no difference between placement of NE and NG tubes. While subclinical oesophagitis may have existed, no patients showed clinical signs of oesophagitis. There was a significantly higher number of deaths among patients receiving NG tubes, though this is likely attributable to NG tubes being used in more critically ill patients, an artifact of the retrospective nature of the study. Because NG tubes provide the benefit of allowing gastric decompression, and there were seemingly no clinical signs of resultant oesophagitis, clinicians should feel less hesitation in using NG tubes over NE tubes.

Can RBCs be given through an infusion pump?

Whether there is an optimal method of red blood cell (RBC) transfusion administration has been a point of discussion. Studies evaluating the effect of various administration methods on the integrity of blood cells exist, focused on the in vitro effect of infusion pumps and measuring markers of RBC breakdown (free haemoglobin, potassium, lactate dehydrogenase, bilirubin) and osmotic fragility. Results vary from significant to insignificant increases in these values, with red cells stored longer showing larger increases in haemolysis markers than those stored for shorter periods. The variability in results, in addition to anecdotal evidence of patients benefiting from RBC transfusions administered with infusion pumps, is a cause for varying opinions.
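Free haemoglobin is usually interpreted through a percent-haemolysis calculation. The sketch below implements the standard blood-banking formula from the markers mentioned above; the example input values are illustrative, not data from these studies.

    # Standard blood-bank percent-haemolysis calculation; the example
    # inputs are illustrative, not study data.

    def percent_hemolysis(free_hb_g_dl, total_hb_g_dl, hct_percent):
        """% haemolysis = (100 - HCT) * plasma free Hb / total Hb."""
        return (100 - hct_percent) * free_hb_g_dl / total_hb_g_dl

    # A unit with total Hb 18 g/dL, HCT 55%, plasma free Hb 0.1 g/dL:
    # prints 0.25%; < 1% is a commonly cited acceptability threshold.
    print(f"{percent_hemolysis(0.1, 18, 55):.2f}%")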

A study assessing the in vivo survival time of RBCs delivered by various infusion methods compared gravity flow, a volumetric peristaltic pump, and a syringe pump in autologous transfusions in dogs. Blood was collected from 9 healthy dogs, washed, and separated into 3 portions labeled with different densities of biotin. These labeled red cells were transfused through either gravity flow with a 170–260 µm filter, a volumetric peristaltic infusion pump with a 170–260 µm filter, or a syringe infusion pump with an 18 µm microaggregate filter, at 2 ml/kg/h. Blood was sampled from test subjects at day 1 and every 7 days until day 49, measuring the proportion of red cells with biotin labels by flow cytometry. Additional in vitro testing was conducted, measuring plasma haemoglobin and osmotic fragility.

Labeled RBCs infused through gravity flow, the volumetric pump, and the syringe pump were detectable in 100% (8/8), 50% (4/8), and 14.3% (1/7) of samples, respectively, post-transfusion. Among the RBCs that remained detectable, quantity and half-life did not differ between gravity flow and volumetric pump delivery (4/8). The RBCs infused via syringe pump that were detected at 24 hours post-transfusion were no longer detectable at 7 days, indicating complete removal of those cells from circulation sometime between 24 hours and 7 days post-transfusion. No differences were seen in the in vitro values examined.
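Half-life from serial biotin-label measurements is typically derived by fitting an exponential decay curve to the label proportions over time. The sketch below shows the general approach; the data points are hypothetical, not measurements from the study.

    # Sketch of estimating post-transfusion RBC half-life from serial
    # flow-cytometry label proportions; the data points are hypothetical,
    # not measurements from the study.
    import numpy as np
    from scipy.optimize import curve_fit

    days = np.array([1, 7, 14, 21, 28, 35, 42, 49], dtype=float)
    label_fraction = np.array([1.0, 0.85, 0.70, 0.58, 0.49, 0.41, 0.34, 0.28])

    def decay(t, a, k):
        return a * np.exp(-k * t)

    (a, k), _ = curve_fit(decay, days, label_fraction, p0=(1.0, 0.02))
    print(f"Estimated half-life: {np.log(2) / k:.1f} days")  # ~26 days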

The study concluded that delivery of RBCs with a syringe pump and microaggregate filter is associated with a significant decrease in in vivo survival time. Volumetric pump delivery was associated with a 50% probability of loss of transfused RBCs within the first 24 hours, and gravity flow allowed the highest chance of RBC survival. The reason behind this difference is speculated to be mechanical shear damage to the RBC membranes when transfused through the microaggregate filter, causing preferential removal of damaged cells upon entry into the circulation and exposure to the mononuclear phagocytic system. Though unconfirmed, microclots could have formed in the blood during resuspension in sub-room-temperature plasma, placing a higher degree of shearing stress on the RBCs passing through the filter. Early denaturation and oxidation of haemoglobin due to the mechanical stress induced by the syringe pump and volumetric pump methods, leading to IgG binding to the red cell surface and removal from circulation, is another possible cause of early removal.

Small sample sizes limiting the power of results are a common limitation in the veterinary field, and this study is no exception. The results are most relevant to the exact methods used in the study, and we can only speculate about alternate setups that remove the microaggregate filter from the syringe pump method (use of an in-line pediatric 170–260 µm filter, or extraction of blood into a syringe through a 170–260 µm filter administration set, for example).

The authors of the study recommended against using a syringe pump with an 18 µm microaggregate filter in light of their results, though considering the limitations, drastic changes to clinical protocols were not stated to be necessary. The current best practice considering this evidence would be to administer blood products via gravity flow for larger-volume, higher-flow-rate transfusions, as long as consistency in flow rate is monitored closely (it can be influenced by catheter patency, positioning and motion of the patient, and the amount of blood left in the bag). The syringe pump method is particularly useful when performing small-volume transfusions such as in felines. A similar study performed with feline blood reported that RBC survival time was unaffected by the syringe pump method.
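Monitoring the consistency of gravity flow comes down to drip counting. The sketch below converts a target rate into drops per minute; the 15 gtt/mL drip factor is an assumption for illustration, as drip factors vary by administration set.

    # Drip-rate arithmetic for monitoring a gravity-flow transfusion.
    # Assumes a 15 gtt/mL blood administration set; drip factors vary
    # by set, so check the packaging.

    def drops_per_minute(rate_ml_per_hr, drip_factor_gtt_per_ml=15):
        return rate_ml_per_hr * drip_factor_gtt_per_ml / 60

    # A 20 kg dog at 2 mL/kg/h (the rate used in the study) -> 40 mL/h:
    print(f"{drops_per_minute(40):.0f} gtt/min")  # 10 drops per minute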

There are a couple of infusion pumps approved for blood products, one through internal validation and the other with FDA approval for human blood products. These pumps could be the next best solution, and validation with veterinary blood products is warranted.

Does premedicating reduce chances of reactions?

Premedication, the administration of antihistamines, glucocorticoids, or antipyretics in anticipation of immunologic complications to counter histamine and inflammatory mediators and suppress their effects, has been a traditional practice in transfusion medicine. A number of human studies have observed no difference in the incidence of type I hypersensitivity reactions (allergic reactions) or febrile non-haemolytic transfusion reactions (FNHTR). Some clinicians reason that premedication potentially masks early symptoms of immunologic complications, delaying required interventions, and advocate against it. Evaluation of the difference in reaction severity between recipients with and without premedication has not been performed, and it remains a question whether this reasoning is valid. Human evidence is unfortunately not always directly translatable to veterinary practice, though similar physiological mechanisms are expected. A recent veterinary retrospective study evaluating the effect of premedication on acute transfusion-related reactions saw no beneficial effect. There might be a beneficial effect of diphenhydramine administration in decreasing the chance of acute allergic reactions, though further studies were recommended by the authors since the incidence of allergic reactions in the non-premedicated group was already low (2.6%). Studies evaluating the efficacy of premedication in preventing haemolytic transfusion reactions are apparently not available, and the theoretical benefit is no justification for forgoing proper compatibility testing.

References

1.  Bradford NK, Edwards RM, Chan RJ. Heparin versus 0.9% sodium chloride intermittent flushing for the prevention of occlusion in long term central venous catheters in infants and children: a systematic review. Int J Nurs Stud. 2016;59:51–59.

2.  Liu DT, Brown DC, Silverstein DC. Early nutritional support is associated with decreased length of hospitalization in dogs with septic peritonitis: a retrospective study of 45 cases (2000–2009). J Vet Emerg Crit Care. 2012;22(4):453–459.

3.  Mohr AJ, Leisewitz AL, Jacobson LS, et al. Effect of early enteral nutrition on intestinal permeability, intestinal protein loss, and outcome in dogs with severe parvoviral enteritis. J Vet Intern Med. 2003;17:791–798.

4.  Ueda Y, Odunayo A, Mann FA. Comparison of heparinized saline and 0.9% sodium chloride for maintaining peripheral intravenous catheter patency in dogs. J Vet Emerg Crit Care. 2013;23(5):517–522.

5.  Webster J, Osborne S, Rickard CM, New K. Clinically-indicated replacement versus routine replacement of peripheral venous catheters. Cochrane Database Syst Rev. 2015;(8):CD007798.

6.  Will K, Nolte I, Zentek J. Early enteral nutrition in young dogs suffering from haemorrhagic gastroenteritis. J Vet Med Ser A. 2005;52(7):371–376.

7.  Yu MK, Freeman LM, Heinze CR. Comparison of complication rates in dogs with nasoesophageal versus nasogastric feeding tubes. J Vet Emerg Crit Care. 2013;23(3):300–304.

 
