
Agreement Critically Ill

Thursday, April 8, 2021

In the current study, we found that interobserver agreement on ICUAW was good for patients who could participate in MMT, particularly for patients assessed after discharge from the intensive care unit. The 95% CIs of the Cohen's kappa statistics were broad, spanning both poor and near-perfect agreement, because the incidence of ICUAW was low and our sample was small. Therefore, we cannot be confident that interobserver agreement is as excellent as has been reported to date, particularly for assessments performed in the ICU. Interobserver agreement on the MRC sum score as a continuous outcome was rarely perfect and differed by 10% or more in nearly a quarter of patients. Interobserver agreement for individual muscle groups was poor, especially for proximal muscles. Proximal muscles, particularly hip flexion, were the most likely to go unassessed. In a recent study of medical ICU patients receiving 5 or more days of mechanical ventilation, Ali et al. [7] enrolled 174 patients, of whom only 38 (22%) were unable to perform MMT. However, 94 patients were excluded because they were unlikely to awaken, and another 40 were excluded for inability to communicate. Thus, about 50% of potential patients were not considered because of cognitive inability to cooperate with volitional testing. The authors did not provide data on whether patients were in the ICU or on the inpatient ward at the time of the first evaluation, but they indicated that most patients had been evaluated on or after the day mechanical ventilation ended (NA Ali, personal communication, April 2009).
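For context, the MRC sum score referenced above is conventionally the sum of strength grades (0 to 5) for six muscle groups tested on both sides (12 measurements, maximum 60), with ICUAW commonly defined as a sum below 48. The sketch below is illustrative only; the muscle-group names and example grades are assumptions, not data from this study.

```python
# Illustrative sketch of an MRC sum-score calculation (assumed convention:
# six muscle groups tested bilaterally, each graded 0-5, maximum 60,
# ICUAW defined as a sum score below 48). Example grades are hypothetical.

MUSCLE_GROUPS = [
    "shoulder abduction", "elbow flexion", "wrist extension",
    "hip flexion", "knee extension", "ankle dorsiflexion",
]

def mrc_sum_score(grades: dict[str, tuple[int, int]]) -> int:
    """Sum left/right MRC grades (0-5) across the six muscle groups."""
    total = 0
    for group in MUSCLE_GROUPS:
        left, right = grades[group]
        if not (0 <= left <= 5 and 0 <= right <= 5):
            raise ValueError(f"MRC grades must be 0-5, got {grades[group]} for {group}")
        total += left + right
    return total

def has_icuaw(sum_score: int, threshold: int = 48) -> bool:
    """ICUAW if the MRC sum score falls below the threshold (conventionally 48/60)."""
    return sum_score < threshold

# Hypothetical patient with mild proximal weakness (hip flexion graded lower).
example = {
    "shoulder abduction": (4, 4), "elbow flexion": (4, 5), "wrist extension": (5, 5),
    "hip flexion": (3, 3), "knee extension": (4, 4), "ankle dorsiflexion": (4, 4),
}
score = mrc_sum_score(example)
print(score, has_icuaw(score))  # 49 False
```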

The Ali et al. [7] study included an assessment of interobserver agreement between two observers who examined 12 patients. They reported perfect agreement on the diagnosis of ICUAW but did not report the timing or location of these 12 evaluations. Each observer identified six patients with ICUAW (MRC sum score <48), for an incidence of 17% (95% CI, 3% to 31%). Among all patients, interobserver agreement was 93% (Cohen's kappa = 0.76; 95% CI, 0.44 to 1.0).
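As a note on the statistic itself, Cohen's kappa compares the observed agreement between two raters with the agreement expected by chance. The sketch below shows the calculation for a dichotomous ICUAW classification using hypothetical ratings; it does not reproduce the patient-level data behind the figures above.

```python
# Minimal sketch of Cohen's kappa for two raters making a yes/no ICUAW call.
# The ratings below are hypothetical, not the study data.

def cohens_kappa(rater_a: list[bool], rater_b: list[bool]) -> float:
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal probability of a "yes" call.
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical example: 12 patients, the raters disagree on two of them.
a = [True, True, True, False, False, False, False, False, False, False, False, False]
b = [True, True, False, True, False, False, False, False, False, False, False, False]
print(cohens_kappa(a, b))  # ~0.56
```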
