Method & Results for State of Surgical Video Report

Below, the methods used for each component of the "Report on the State of Surgical Video" are explained, followed by a table presenting the quantitative results from this study. If you have any questions or concerns, please do not hesitate to email us at contact@jomi.com.

Conflicts of interest: The author is a Junior Medical Editor and partner at the Journal of Medical Insight.

Full Report is available at https://jomi.com/report-state-surgical-video/

Last Updated: July 29, 2015

Methods:

A sample of 15 videos was randomly selected from each of the resources below. Where multiple specialties were available, care was taken to ensure the sample included all available specialties. These 15 videos were used to evaluate the quality and, in part, the value and content of each resource. The procedures used for each of the five categories are outlined below.

Type of Content:

The 15 sample videos provided the primary source for this category. Other forms of content were added based on exploration of each website and any additional information contained therein.

Specialties Covered:

In most cases, the website listed the specialties covered by its videos. Otherwise, this section was determined by perusal of the available videos, especially the 15 randomly selected sample videos.

Users:

In many cases, the target audience is clearly stated on the site; in others, account-registration requirements indicate the intended users. When this information was unavailable, the author made an educated guess at the typical user profile based on his impressions of the sample videos. The focus and depth of narration is particularly useful in determining the intended audience.

Volume:

For most of the resources below, the total number of videos is readily advertised or easily calculated. However, when this information was not available, the author estimated the volume of videos based on page length, number of specialties, and volume of procedures within each specialty.  A volume rating was assigned based on the total number of surgical videos provided by each resource, according to Table 1.

Table 1: Volume Rating System

Total Videos    Rating
≥ 1000          5
≥ 500           4
≥ 250           3
≥ 125           2
< 125           1
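As a minimal sketch (not the report's actual tooling), the Table 1 mapping from total video count to volume rating can be expressed as:

```python
def volume_rating(total_videos: int) -> int:
    """Map a resource's total video count to a 1-5 volume rating (Table 1)."""
    # Thresholds from Table 1, checked from highest to lowest.
    for cutoff, rating in [(1000, 5), (500, 4), (250, 3), (125, 2)]:
        if total_videos >= cutoff:
            return rating
    return 1  # fewer than 125 videos

print(volume_rating(1200))  # 5
print(volume_rating(300))   # 3
print(volume_rating(50))    # 1
```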

Quality:

To evaluate the quality of the videos hosted by each resource, the American Academy of Orthopaedic Surgeons (AAOS) guidelines for Peer Review/Evaluation of Video Submissions were applied to the 15 sample videos. The AAOS guidelines comprise 28 points for evaluating surgical videos; however, the number of points relevant to an individual video varied, especially for general surgery videos. Therefore, each video was scored as the percentage of relevant points that were satisfied. For example, a video meeting 15 of 20 relevant points scores 75%. The mean and standard deviation of this percentage were then calculated across the 15 sample videos, and a rating on a scale of one to five was assigned to each resource according to Table 2.

Table 2: Quality Rating System

Mean Score    Rating
≥ 85%         5
≥ 70%         4
≥ 55%         3
≥ 40%         2
< 40%         1
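The quality procedure can be sketched as follows. The per-video counts below are purely hypothetical examples, not data from the study; the threshold mapping follows Table 2.

```python
import statistics

def video_score(points_satisfied: int, points_relevant: int) -> float:
    """Percentage of relevant AAOS points a single video satisfies."""
    return 100.0 * points_satisfied / points_relevant

def quality_rating(mean_score: float) -> int:
    """Map a mean percentage score to a 1-5 quality rating (Table 2)."""
    for cutoff, rating in [(85.0, 5), (70.0, 4), (55.0, 3), (40.0, 2)]:
        if mean_score >= cutoff:
            return rating
    return 1

# Hypothetical (satisfied, relevant) counts for a 15-video sample.
sample = [(15, 20), (18, 22), (12, 18), (20, 24), (14, 20),
          (16, 20), (19, 25), (13, 16), (17, 21), (15, 18),
          (11, 15), (18, 20), (14, 19), (16, 22), (12, 14)]
scores = [video_score(s, r) for s, r in sample]
mean = statistics.mean(scores)
sd = statistics.stdev(scores)  # sample standard deviation
print(f"mean={mean:.2f}%, sd={sd:.2f}%, rating={quality_rating(mean)}")
```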

Value:

The value rating, marked on a half-integer scale from 0 to 5, was determined by the fulfillment of ten conditions. Each condition satisfied contributes 0.5 to the value rating. The conditions are as follows:

  1. Resource includes a peer-review process
  2. Majority of articles include an adequately detailed written component
  3. Resource has a quality rating ≥ 3
  4. Majority of articles include a didactic narration
  5. ≤ 25% of videos were submitted before 2005
  6. Resource has volume rating ≥ 3
  7. Resource includes Editorial Board
  8. Resource offers Continuing Medical Education (CME) credit
  9. Resource avoids promotional content/industry influence
  10. Resource includes at least 5 specialties

Although it is theoretically possible for a resource to score zero on the value rating (i.e., by satisfying none of the conditions), no such case was encountered during this study.
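The value calculation above amounts to summing a ten-item checklist at 0.5 points per satisfied condition. A minimal sketch, with an entirely hypothetical checklist for an illustrative resource (not one of the resources in this report):

```python
def value_rating(conditions: dict[str, bool]) -> float:
    """Each of the ten conditions satisfied contributes 0.5 to the value rating."""
    if len(conditions) != 10:
        raise ValueError("exactly ten conditions expected")
    return 0.5 * sum(conditions.values())

# Hypothetical checklist keyed by shorthand names for the ten conditions.
checklist = {
    "peer_review_process": True,
    "detailed_written_component": True,
    "quality_rating_ge_3": True,
    "didactic_narration": True,
    "le_25pct_pre_2005": False,
    "volume_rating_ge_3": False,
    "editorial_board": True,
    "offers_cme_credit": False,
    "avoids_industry_influence": True,
    "at_least_5_specialties": True,
}
print(value_rating(checklist))  # 3.5
```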

Results:

The quality, value, and volume ratings assigned to each of the online resources are summarized in Table 3 below. Subsequent sections also include the type of content, specialties covered, and typical user profiles for each of the resources studied. Pro/con lists for the quality and value sections are provided to supplement the numerical ratings. A brief background on each site is also included.

Table 3: Quantitative Results

Resource              Mean Quality   Standard Deviation   Quality   Value   Volume
ClinicalKey           74.57%         9.11%                4         4       1
JoMI                  92.37%         5.99%                5         3.5     1
MedlinePlus           55.37%         6.23%                3         3       4
ORLive/BroadcastMed   49.32%         8.17%                2         2.5     2
SurgeryTheater        32.84%         10.55%               1         1.5     5
VJOrtho               60.07%         6.52%                3         2       2
VuMedi                44.42%         10.70%               2         2.5     5
WebSurg               41.22%         10.58%               2         2.5     5
YouTube               37.86%         9.51%                1         1.5     5

The table above shows the mean quality score and standard deviation for each resource examined, determined by the AAOS guidelines and averaged across the 15 sample videos. The corresponding Quality rating (Table 2) is shown, along with the Volume (Table 1) and Value ratings.

Maintained by Joe Serino