Charlotte, North Carolina, October 8-11, 2016

SAIR Conference Program 2016
Tuesday, October 11, 2016
9:15 am to 10:00 am


Accreditation / Work Share

Session 87
Student Achievement Data: An Overview of Institutional Websites


Each institution in the SACSCOC region has been asked to provide a public link to student achievement data on its website. This session presents an overview of that data from a variety of institutions, attempting to identify the kinds of data presented, as well as their scope and the general context provided at the institutional level.

Room: Morehead
Presenter(s):
Ginny Cockerill The University of Alabama in Huntsville




Facilitator: Louise Fisher

Assessment / Work Share

Session 81
Assessing the Assessor: Measuring the Impact of Assessment Offices


Assessment offices have to play the role of “honest broker” in order to build or earn institutional trust. These offices often monitor, review, evaluate, and provide feedback to various institutional units, but who monitors the assessment offices? How do they know that they have made an impact? This session will focus on strategies that assessment offices can use to assess themselves and measure their impact on the institution.

Room: Tryon North
Presenter(s):
John Frederick The University of North Carolina at Charlotte
Christine Robinson The University of North Carolina at Charlotte



Facilitator: Meaghann Wheelis

Assessment / Work Share

Session 89
One University’s Process to Align Annual Assessment Reporting and Academic Program Review


Virginia Tech is currently in the process of aligning two primary responsibilities housed in the Office of Assessment and Evaluation: academic program review and annual assessment reporting. This session will focus on how these two separate processes have been conceptually and strategically linked in order to better support each other and facilitate continuous improvement. Specific examples and detailed information on the design of these two processes will be shared during the session. We will review lessons learned during the revision process and potential pitfalls to be avoided.

Room: Queens
Presenter(s):
Bethany Bodo Virginia Tech
Molly Hall Virginia Tech
Steven Culver Virginia Tech


Facilitator: Sandi Bramblett

Assessment / Paper

Session 91
Faculty Attitudes regarding Assessment


Because assessment involves teaching and learning, faculty buy-in is key to the successful implementation of any assessment program. While faculty attitudes toward assessment affect buy-in, very little research has examined faculty attitudes toward assessment at the university level. The current study examined the relationship between faculty knowledge of assessment and positive/negative conceptions of assessment. Faculty at a public university in Virginia were asked to complete the Teachers’ Conceptions of Assessment-III (Brown, 2006), a 27-question Likert-type measure of attitudes toward assessment, and a 25-question true/false exam measuring knowledge of assessment.

Room: Grand Ballroom B
Presenter(s):
Sarah L. Strout Radford University
Sandra Nicks Baker Radford University



Facilitator: Rob Ricks

Institutional Research / Paper

Session 83
Pathways to Completion: An Analysis of the Enrollment Patterns of Recent Baccalaureates in Florida


This paper presents the results of a retrospective look at the enrollment patterns of students who completed their first baccalaureate degree at a Florida public university. We describe general enrollment patterns and the common student characteristics associated with each. We also examine differences in pattern frequency by broad program discipline. Finally, we discuss the implications of these patterns for time-to-completion.

Room: Harris
Presenter(s):
Christy England-Siegerdt Board of Governors, State University System of Florida
Kathy Padgett Board of Governors, State University System of Florida



Facilitator: Justin Chandler

Institutional Research / Paper

Session 84
Calculating and Reporting Effect Sizes, Power, and P-Values to Communicate the Statistical and Practical Significance of Results


Many submissions to scientific journals fail to report effect sizes and power in quantitative studies while prominently listing p-values. In this presentation, the author will explain the relevance of effect size, power, and significance testing for planning, analyzing, reporting, and understanding education research studies. These calculations are rarely done by hand; instead, researchers typically rely on software or reference tables, much as tables of critical values for t, F, and other statistics were once used to determine statistical significance. The aim of this presentation is to clarify these concepts and to provide examples, using G*Power and SAS, of how to calculate and report effect sizes, sample sizes, and p-values. The components of sample size calculations will be discussed, along with the factors to consider in choosing a sample size. Related concepts such as confidence intervals, variability, Type I error, Type II error, and the minimum effect size of interest will also be covered.
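For readers who want a concrete sense of the calculations the abstract describes, the following is a minimal sketch in Python (rather than the G*Power and SAS tools named above, and not the presenter's materials), using hypothetical scores to compute Cohen's d, achieved power, and a required sample size.

```python
# Minimal sketch: Cohen's d and statistical power for a two-group comparison.
# The data below are hypothetical; G*Power and SAS produce analogous results.
import numpy as np
from statsmodels.stats.power import TTestIndPower

# Hypothetical outcome scores for two cohorts.
group_a = np.array([72, 75, 78, 80, 69, 74, 77, 81, 70, 76], dtype=float)
group_b = np.array([68, 71, 70, 73, 66, 69, 72, 74, 67, 70], dtype=float)

# Cohen's d using the pooled standard deviation.
n_a, n_b = len(group_a), len(group_b)
pooled_sd = np.sqrt(((n_a - 1) * group_a.var(ddof=1) +
                     (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
cohens_d = (group_a.mean() - group_b.mean()) / pooled_sd

analysis = TTestIndPower()
# Achieved power for the observed effect size at alpha = .05.
power = analysis.solve_power(effect_size=cohens_d, nobs1=n_a,
                             ratio=n_b / n_a, alpha=0.05)
# Sample size per group needed to detect a medium effect (d = 0.5) with 80% power.
n_needed = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)

print(f"Cohen's d = {cohens_d:.2f}, achieved power = {power:.2f}, "
      f"n per group for d = 0.5 at 80% power = {np.ceil(n_needed):.0f}")
```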

Room: Independence
Presenter(s):
Jamil University of Mississippi Medical Center




Facilitator: Jesse Wrenn

Institutional Research / Paper

Session 90
Signal or Noise? Using Homegrown Data to Predict Attrition


National reports and benchmarking data suggest a multitude of reasons why students may not persist at the first college or university they choose to attend. Yet these standardized reports do not account for institution-specific challenges or efforts. In this presentation, we will discuss how to bring together cognitive, non-cognitive, and academic performance measures, along with other available institution-level student data, to determine how colleges and universities can best predict academic success and retention on their own campuses. The presentation will include a discussion of data cross-pressures from different areas on campus, how campuses can strategically respond to what retention and student success data illuminate, and how students can be included in framing administrative understanding of these data.
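As an illustration of the kind of "homegrown" model the abstract points to, here is a brief sketch (hypothetical file and column names, not the presenter's model) that combines cognitive, non-cognitive, and academic performance measures in a simple retention classifier.

```python
# Illustrative sketch only: a basic retention model from institution-level data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

students = pd.read_csv("institution_student_data.csv")  # hypothetical extract

features = ["hs_gpa", "test_score", "grit_scale",
            "first_term_gpa", "credits_attempted"]
X = students[features]
y = students["retained_year2"]  # 1 = retained, 0 = departed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# AUC gives a quick read on whether the local data carry signal beyond noise.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```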

Room: Grand Ballroom A
Presenter(s):
Will Miller Flagler College




Facilitator: Greg Ohlenforst

Institutional Research / Work Share

Session 92
Understanding Completion Patterns of Non-Traditional Students


Students attending American Public University System (APUS) are typically non-traditional learners who often arrive with transfer credits. Since IPEDS graduation rates consider only first-time students, these students may successfully complete an associate or bachelor's degree yet never be counted as graduates of any institution. In addition to better understanding completers, the research team sought to determine whether non-completers go elsewhere to complete an academic program. This presentation will review the process for retrieving student records from the student information system, obtaining data on these students from the National Student Clearinghouse, and analyzing the results in Tableau, and will conclude with a summary of the findings.
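A minimal sketch of the kind of join described above, with hypothetical file and column names (not APUS's actual process): linking institutional records to a National Student Clearinghouse return file to flag non-completers who later finished a program elsewhere, then exporting for Tableau.

```python
# Hypothetical column names; illustrates the record-linkage step only.
import pandas as pd

sis = pd.read_csv("sis_extract.csv")        # institutional student records
nsc = pd.read_csv("nsc_detail_return.csv")  # NSC StudentTracker detail file

merged = sis.merge(nsc, on="student_id", how="left")

# Non-completers at the home institution who show a credential elsewhere.
completed_elsewhere = merged[
    (merged["completed_here"] == 0) & (merged["nsc_graduated"] == "Y")
]
print(f"{len(completed_elsewhere)} non-completers earned a credential elsewhere")

# Export the merged file for visualization in Tableau.
merged.to_csv("nsc_merged_for_tableau.csv", index=False)
```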

Room: Brevard
Presenter(s):
Elizabeth Wallace American Public University System
Dave American Public University System



Facilitator: Jessica Pierce

Institutional Research / Paper

Session 85
Using the National Student Clearinghouse, the CIRP Freshman Survey, and Institutional Data to Help Answer and Prompt More Questions about Graduate School Attendance


Do certain activities lend themselves more directly to attending graduate school? And how do pre-entering characteristics (such as parental education or self-expectations) interact with those activities and with graduate school attendance? This single-institution study considers five experiences: (1) study abroad, (2) internships, (3) leadership positions, (4) undergraduate research, and (5) service. These experiences are combined with the CIRP Freshman Survey and merged with results from the National Student Clearinghouse (NSC) to provide additional insight into graduate school attendance. Issues and concerns are shared and discussed, and in addition to the processes employed, practical coding examples are provided.
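To make the merge concrete, here is a hedged illustration (hypothetical file and column names, not the presenter's code) of joining CIRP Freshman Survey responses, institutional experience flags, and NSC records, then comparing graduate school attendance across experiences.

```python
# Hypothetical inputs; shows the merge-and-compare pattern only.
import pandas as pd

cirp = pd.read_csv("cirp_freshman_survey.csv")     # pre-entering characteristics
experiences = pd.read_csv("experience_flags.csv")  # study abroad, internships, etc.
nsc = pd.read_csv("nsc_grad_enrollment.csv")       # subsequent graduate enrollment

df = (cirp.merge(experiences, on="student_id")
          .merge(nsc[["student_id", "grad_enrolled"]],
                 on="student_id", how="left"))
df["grad_enrolled"] = df["grad_enrolled"].fillna(0)

# Graduate school attendance rate by undergraduate experience.
for flag in ["study_abroad", "internship", "leadership",
             "ugrad_research", "service"]:
    rate = df.loc[df[flag] == 1, "grad_enrolled"].mean()
    print(f"{flag}: {rate:.1%} attended graduate school")
```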

Room: Sharon
Presenter(s):
Robert Springer Elon University




Facilitator:

Planning / Paper

Session 86
Life after College: The Institutional Obligation to Faculty and Staff


College leaders serve as stewards not only of their students’ intellectual, vocational, and avocational futures, but also of the campus faculty and staff’s professional and financial futures. The presenters will review the impact on retirement incomes of the 1980s shift from defined benefit plans (pensions) to defined contribution plans (401(k)s). Inequities are identified, and recommendations on funding and employee support are proposed.

Room: Trade
Presenter(s):
Jim Purcell Office of the Postsecondary Commissioner
Erin Hall Office of the Postsecondary Commissioner
Philip Brodeur Office of the Postsecondary Commissioner


Facilitator: Jon Acker

Technology / Work Share

Session 88
Data Visualization of National Survey of Student Engagement (NSSE) in Tableau for Academic Program Review


Our institution underutilizes survey data for academic program review, and administrators want to increase its use by employing Tableau to create dashboards. National surveys like the National Survey of Student Engagement (NSSE) can provide indirect evidence of student learning outcomes for academic program review (APR). This presentation will (1) describe the data cleaning process, (2) walk through building a dashboard from four years of NSSE survey data, and (3) engage the audience in small groups in a visual analysis exercise using the dashboard. Participants from other institutions will leave with a road map for using dashboards for APR.
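As a rough sketch of the data-preparation step (hypothetical NSSE export column names, not the presenter's actual cleaning script): stacking four years of NSSE files and reshaping them to the long format that Tableau dashboards typically expect.

```python
# Hypothetical column names; demonstrates the stack-and-melt pattern only.
import pandas as pd

years = [2013, 2014, 2015, 2016]
frames = []
for year in years:
    df = pd.read_csv(f"nsse_{year}.csv")  # one institutional data file per year
    df["survey_year"] = year
    frames.append(df)

nsse = pd.concat(frames, ignore_index=True)

# Melt engagement-item columns into item/score pairs keyed by program and year.
long = nsse.melt(
    id_vars=["student_id", "major_program", "class_level", "survey_year"],
    var_name="nsse_item",
    value_name="score",
)
long.to_csv("nsse_long_for_tableau.csv", index=False)
```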

Room: Kings
Presenter(s):
Alicia Dean Auburn University at Montgomery




Facilitator: Carmen Allen

Sessions: 11

SAIR 2016 Charlotte