
Tip

Rylee Manning and Maria Quinones Vieta to add spreadsheets and descriptions here

Reliability Ratings Sheet: https://utexas.box.com/s/ptixovlw80lp63kmi25cf6vqeeh6vo0a

Date | Attendees | Agenda | Notes, decisions and action items

Stephanie Grasso, Maria Quinones Vieta, Rylee Manning

Manuscript: Methods Section

  • Target journal: AJSLP

    • AJSLP journal guidelines are in Box folder

  • Organize the Methods section similarly to the Scoping Review and the ANCDS review

Criteria Section

  • Create a table with search terms (similar to the one in the Scoping Review)

    • Use Excel to create the tables; use different tabs in the spreadsheet for each table
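
A minimal sketch of assembling the manuscript tables as one Excel workbook with one tab per table, assuming pandas and openpyxl are available; the table names, columns, and contents below are placeholders, not the actual search terms or data items.

```python
import pandas as pd

# Placeholder tables; the real search terms and data items go here.
search_terms = pd.DataFrame({
    "Concept": ["Population", "Intervention"],
    "Search terms": ["term A OR term B", "term C OR term D"],
})
data_items = pd.DataFrame({
    "Item": ["Number of participants", "Age", "Years post-onset"],
    "Description": ["...", "...", "..."],
})

# One workbook, one tab (sheet) per table.
with pd.ExcelWriter("manuscript_tables.xlsx", engine="openpyxl") as writer:
    search_terms.to_excel(writer, sheet_name="Search terms", index=False)
    data_items.to_excel(writer, sheet_name="Data items", index=False)
```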

Data Items Section

  • Reference the ANCDS review and only briefly summarize the main items that were rated; mention that it is already published and open access. Then, describe the items that were rated for our review in more detail.

Results Section

  • Report general features of the studies and our newly extracted info

Tasks

  •  Rylee Manning will finish double-checking her ratings for papers 146-149.
  •  Once all data for the ratings are included in the spreadsheet, Maria Quinones Vieta and Rylee Manning can check final reliability scores for all reviewers (see the sketch after this list)
  •  Rylee Manning and Maria Quinones Vieta will establish mutual consensus for papers where reviewers showed discrepancy for Explicit vs. Implied Criteria
  •  Next week, Stephanie Grasso, Maria Quinones Vieta, and Rylee Manning will look at examples of papers with ambiguity regarding eligibility criteria (i.e., whether it was implied or explicit) and decide on tie-breakers for these instances
    •  If we see a pattern wherein differences in ratings between explicit and implied criteria cannot be clearly attributed to specific raters, we will need to re-review the explicit vs. implied criteria for all studies, since this is a central component of our study
    •  We consider criteria explicit when the authors state that the features are part of their inclusion/exclusion criteria OR when they discuss their inclusion/exclusion criteria preceding or following the description of these features
      •  e.g., Participants were monolingual English speakers. In addition, other inclusion criteria were…
      •  e.g., Inclusion criteria included the absence of another neurological condition. Participants were also all monolingual English speakers.
      •  Implied: All participants were monolingual, right-handed, and below 80 years of age.
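
A minimal sketch of the reliability check described above, assuming each reviewer's ratings live in a spreadsheet with one row per paper keyed by a shared Paper_ID column; the file names, the Explicit_vs_Implied item, and the rater pairing below are placeholders. Percent agreement and Cohen's kappa are shown as two common ways to express inter-rater reliability; the actual analysis may use a different statistic.

```python
import pandas as pd

def agreement_and_kappa(r1: pd.Series, r2: pd.Series):
    """Percent agreement and Cohen's kappa for two raters on the same papers."""
    pairs = pd.DataFrame({"r1": r1, "r2": r2}).dropna()
    po = (pairs["r1"] == pairs["r2"]).mean()                    # observed agreement
    p1 = pairs["r1"].value_counts(normalize=True)
    p2 = pairs["r2"].value_counts(normalize=True)
    categories = set(p1.index) | set(p2.index)
    pe = sum(p1.get(c, 0) * p2.get(c, 0) for c in categories)   # chance agreement
    kappa = (po - pe) / (1 - pe) if pe < 1 else 1.0
    return po, kappa

# Hypothetical per-reviewer files keyed by Paper_ID.
rylee = pd.read_excel("ratings_rylee.xlsx").set_index("Paper_ID")
marifer = pd.read_excel("ratings_marifer.xlsx").set_index("Paper_ID")

po, kappa = agreement_and_kappa(rylee["Explicit_vs_Implied"],
                                marifer["Explicit_vs_Implied"])
print(f"Percent agreement: {po:.2%}   Cohen's kappa: {kappa:.2f}")

# Papers needing consensus are the rows where the two ratings differ.
both = rylee.join(marifer, lsuffix="_rylee", rsuffix="_marifer")
disagree = both[both["Explicit_vs_Implied_rylee"] != both["Explicit_vs_Implied_marifer"]]
print("Papers to resolve by consensus:", disagree.index.tolist())
```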

Rylee Manning, Maria Quinones Vieta, Stephanie Grasso

  • Rylee created a rough draft of some tables to include in the manuscript

    • Marifer to verify search terms, Dr. Grasso to review as well

  • Discrepancies & Reliability

  • Preliminary results to be included with ICAA poster

  • Reference Abstract and Manuscript draft

  • Use KC & GY’s poster as a guide

  •  Maria Quinones Vieta will compile data from individual reviewers into one sheet and add a column for Rater Number (see the sketch at the end of this list)
  • We identified the problem of redundancy from our initial consensus review

    • Compile reviews into one sheet and identify the double ratings in it.

    • Copy that sheet and delete the double ratings from the copy.

  • Columns BE - BL (highlighted in blue)

    • Rylee will review for redundancy

      • Where the data in these columns are redundant, Rylee will edit, highlight, and then copy-paste the new data into the Consensus sheet

  • New papers to rate (28)

    • Initially, we had considered all of the studies, including Ana Volkmer's, as part of the ANCDS review because they were in the master table; the newly found studies were then assigned to Lucas along with the video instructions. Ana's studies were later identified as not part of the initial review, and papers that did not qualify based on our criteria were deleted.

      •  Lucas will rate 105, 108, 111, 112, 113, 114, 115, 116, 117, 118, 119
      •  Rylee Manning will rate 121, 122, 123, 124, 125, 128, 129, 130, 131, 132, 133
      •  Maria Quinones Vieta will rate 135, 136, 137, 138, 139, 140, 141, 142
  • Get in touch with Lucas to inform him about the redundancy pattern and make sure he will enter data in the correct way
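
A minimal sketch of the compile-and-deduplicate step, assuming each reviewer's ratings sit in a separate Excel file with one row per paper keyed by a Paper_ID column, and that the analysis sheet already carries these columns. The file names, rater numbering, and the definition of a double rating below are assumptions to be adjusted to the actual workbook.

```python
import pandas as pd

# Hypothetical per-reviewer files; adjust names and paths to the real ones.
reviewer_files = {
    1: "ratings_rylee.xlsx",
    2: "ratings_marifer.xlsx",
    3: "ratings_lucas.xlsx",
}

# Stack all reviewers into one sheet and tag each row with its Rater_Number.
frames = []
for rater, path in reviewer_files.items():
    df = pd.read_excel(path)
    df["Rater_Number"] = rater
    frames.append(df)
combined = pd.concat(frames, ignore_index=True)

# Here a "double rating" is a paper that appears more than once in the
# combined sheet; adjust the subset if redundancy is defined differently.
dupes = combined.duplicated(subset=["Paper_ID"], keep="first")
print("Double-rated papers to check:", combined.loc[dupes, "Paper_ID"].tolist())

# Keep one full sheet plus a copy with the double ratings removed,
# mirroring the two-sheet plan above.
with pd.ExcelWriter("compiled_ratings.xlsx", engine="openpyxl") as writer:
    combined.to_excel(writer, sheet_name="All ratings", index=False)
    combined.loc[~dupes].to_excel(writer, sheet_name="No Double Ratings", index=False)
```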

Rylee Manning, Stephanie Grasso

Preparing the spreadsheet without Double Ratings for data analysis (to present preliminary findings at ICAA)

  • Dr. Grasso edited the No Double Ratings spreadsheet

  • Rylee Manning will delete red papers (i.e., excluded papers indicated in red in the Master Table sheet) from the spreadsheet without double ratings

    • Then, Rylee will insert data for the following variables in the papers that do not already have it in the sheet (see the sketch after this list):

      • Number of Participants

      • Age

      • Years Post-Onset
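
A minimal sketch of this preparation step, assuming the exclusions and the descriptive variables are captured as explicit columns somewhere tabular (the red highlighting in the Master Table cannot be read as data, so a hypothetical Excluded column would need to mirror it first) and that the analysis sheet already has possibly blank Number_Participants, Age, and Years_Post_Onset columns. Every file, sheet, and column name below is a placeholder.

```python
import pandas as pd

master = pd.read_excel("master_table.xlsx", sheet_name="Master Table")
ratings = pd.read_excel("compiled_ratings.xlsx", sheet_name="No Double Ratings")

# "Red papers" = excluded papers; assumed to be flagged in an Excluded column.
excluded_ids = master.loc[master["Excluded"] == 1, "Paper_ID"]
ratings = ratings[~ratings["Paper_ID"].isin(excluded_ids)]

# Fill in Number_Participants, Age, and Years_Post_Onset for papers that
# do not already have them, pulling the values from the master table.
descriptives = master[["Paper_ID", "Number_Participants", "Age", "Years_Post_Onset"]]
ratings = ratings.merge(descriptives, on="Paper_ID", how="left",
                        suffixes=("", "_from_master"))
for col in ["Number_Participants", "Age", "Years_Post_Onset"]:
    ratings[col] = ratings[col].fillna(ratings[col + "_from_master"])
    ratings = ratings.drop(columns=col + "_from_master")

ratings.to_excel("icaa_analysis_sheet.xlsx", index=False)
```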

Rylee Manning, Maria Quinones Vieta

ICAA Poster

  • Dr. Grasso created the figures for Number_Participants, Languages_Spoken, and the figure by Country (see the figure sketch after this list)

    • figure for Race_Ethnicity to be ready on Sunday 9/22

  • Rylee updated Languages_Spoken to indicate the number of studies (n=149)

    • also inserted white text box to cover “final adjustments” in figure by Country

  • Rylee and Marifer made additional poster edits

    • formatting and captions

    • Rylee will discuss with Dr. Grasso on Monday 9/23 before sending the poster to the printer
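
For figures like Languages_Spoken, regenerating directly from the analysis sheet would avoid manual patches such as the white text box. A minimal sketch, assuming a hypothetical icaa_analysis_sheet.xlsx with one row per included study and one Languages_Spoken value per row (multi-language entries would need splitting first); file and column names are placeholders and the actual figures may be built differently.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical analysis sheet with one row per included study.
data = pd.read_excel("icaa_analysis_sheet.xlsx")
n_studies = len(data)  # should come out to 149 for the current set

counts = data["Languages_Spoken"].value_counts()

fig, ax = plt.subplots(figsize=(6, 4))
counts.plot(kind="bar", ax=ax)
ax.set_title(f"Languages spoken across included studies (n={n_studies})")
ax.set_xlabel("Language")
ax.set_ylabel("Number of studies")
fig.tight_layout()
fig.savefig("languages_spoken.png", dpi=300)
```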

  1. Rylee’s reviews are nearly complete; Lucas is making progress; Marifer is just getting started

  2. Rylee worked on methods

    1. Notes on introduction as well

  3. Rylee will be working on redundancy and we will touch base on progress

  4. Inter-rater reliability: Marifer will calculate this between raters when the final ratings are completed. This should account for redundancies being removed; in other words, we want each rater's “final ratings” to not contain redundancies prior to conducting the IRR.

    1. Copy and paste the columns we identified as having redundancies (columns BE - BL), and then Marifer will recalculate.

    2. After we work through IRR, we establish final consensus, and use those consensus ratings to replace the final ratings used for the data reported in the paper

      1. The final datasheet will be the No Double Ratings sheet used for the poster, BUT we are re-creating it to have all the ratings and changes to ratings made during the consensus process (so we will delete the old No Double Ratings spreadsheet and replace it with the updated version from the steps outlined above)
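
A minimal sketch of rebuilding the No Double Ratings sheet after consensus, assuming the consensus decisions get recorded in a small table keyed by Paper_ID and the rated item; all file, sheet, and column names are placeholders for whatever the workbook actually uses.

```python
import pandas as pd

ratings = pd.read_excel("compiled_ratings.xlsx", sheet_name="No Double Ratings")
# Hypothetical consensus log with columns: Paper_ID, Item, Consensus_Rating.
consensus = pd.read_excel("consensus_decisions.xlsx")

# Overwrite the original rating with the consensus value wherever one exists.
for _, row in consensus.iterrows():
    mask = ratings["Paper_ID"] == row["Paper_ID"]
    ratings.loc[mask, row["Item"]] = row["Consensus_Rating"]

# This updated sheet replaces the old No Double Ratings spreadsheet.
ratings.to_excel("no_double_ratings_final.xlsx", index=False)
```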
