SAIR Conference Program 2017
Monday, October 09, 2017
9:30 am to 10:15 am
Assessment / Paper
Analysis of the Effects of an Institutional Campaign to Increase Student Course Load on Second-Semester Retention Rates
The purpose of this study was to investigate whether and how second-semester retention rates varied before and after the initiation of an academic engagement campaign at a public research university. Students who took 12 credits were 162% more likely to withdraw than students who took 15 credits. Overall, the campaign geared toward increasing first-time-in-college (FTIC) students’ first-semester credit load did not have any adverse effects on retention, as independent-samples t-tests indicated no significant difference in retention rates before and after the campaign. Further analysis will examine the role of student characteristics (e.g., citizenship, student major) in retention rates.
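The before/after comparison described above can be sketched as an independent-samples (Welch's) t-test; the retention rates, sample sizes, and variable names below are illustrative assumptions, not the study's data.

```python
import numpy as np
from math import sqrt

rng = np.random.default_rng(2)

# Hypothetical pre- and post-campaign retention indicators (1 = retained).
# The 0.82 / 0.83 rates and n = 800 per group are illustrative assumptions.
pre = rng.binomial(1, 0.82, 800).astype(float)
post = rng.binomial(1, 0.83, 800).astype(float)

def welch_t(a, b):
    """Welch's independent-samples t statistic (unequal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / sqrt(va + vb)

t = welch_t(pre, post)  # small |t| here is consistent with "no significant difference"
```

A |t| well below the critical value (about 1.96 at the 0.05 level for these sample sizes) would match the null finding the abstract reports.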
Room: Live Oak V
Galiya Tabulda Florida State University
Emily Daina Saras Florida State University
Smriti Ingrole Florida State University
Facilitator: Robin Logan University of the Incarnate Word
Assessment / Work Share
Best practices for maintaining meaningful, flexible, and sustainable assessment practices
Assessment is a process that encourages academic degree programs to continuously improve the quality of the education they provide to students. However, faculty often associate assessment with accreditation and rules, seeing it as a bureaucratic process in which they submit hollow interpretations of unreliable student learning data. Assessment does not have to be an inefficient process. Providing academic degree programs with articulated expectations via an assessment rubric, allowing programs ownership of their assessment activities through a flexible timeline and submission process, and offering formative feedback on the quality of their assessment processes can lead to meaningful and purposeful discussions about student learning.
Room: Trinity Central
Katie Boyd Auburn University
Rebecca Jones Auburn University
Facilitator: Lynne Crosby Austin Peay State University
Community College / Panel
Voluntary Framework of Accountability - how community colleges are using better data to drive institutional improvement
The American Association of Community Colleges will lead a discussion showcasing the importance of having an appropriate, mission-aligned set of student progress and outcomes metrics. AACC will kick off the session with an exploration of the metrics in the Voluntary Framework of Accountability (VFA) - the first national measurement framework created for community colleges, by community colleges. A current VFA participant will then highlight how they are using their VFA data to shape and drive institutional improvement toward student success goals, informing on-campus efforts aimed at improving practice and pedagogy and increasing student success.
Room: Elm Fork II
Kent A. Phillippe American Association of Community Colleges
Patrick Sanger Alvin Community College
Facilitator: Rene Cintron Louisiana Community & Technical College System
Institutional Research / Work Share
The Impact of Freshmen Retention Rates on Three-Year Community College Graduation Rates: A Regression Analysis Using Panel Data
Based on a study of a decade's worth of longitudinal data for close to 900 community colleges, the presenters sought to quantify the average impact of changes in freshman retention on graduation rates. The specific research objective was to estimate the retention-graduation relationship at the institutional level, after attempting to control for important differences between schools via fixed effects regression analysis. The session will cover how to construct a fixed effects model, as well as the potential implications of the study results for school policy.
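The fixed effects (within) estimator the presenters describe can be sketched on simulated panel data; the school counts, coefficient, and variable names below are illustrative assumptions, not the study's actual panel.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: 50 colleges observed over 10 years.
n_schools, n_years = 50, 10
school = np.repeat(np.arange(n_schools), n_years)
alpha = rng.normal(0, 5, n_schools)                 # unobserved, time-invariant school effects
retention = rng.uniform(50, 90, n_schools * n_years)
beta_true = 0.4                                     # assumed retention -> graduation slope
grad_rate = 10 + beta_true * retention + alpha[school] + rng.normal(0, 1, len(school))

def demean(x, groups):
    """Within transformation: subtract each group's mean from its observations."""
    totals = np.zeros(groups.max() + 1)
    np.add.at(totals, groups, x)
    counts = np.bincount(groups)
    return x - (totals / counts)[groups]

# Demeaning by school removes alpha, so the slope can be estimated by OLS
# on the transformed data -- the core idea of fixed effects regression.
y = demean(grad_rate, school)
x = demean(retention, school)
beta_hat = (x @ y) / (x @ x)
```

Because the within transformation sweeps out anything constant per school (mission, location, selectivity), `beta_hat` reflects how changes in retention relate to changes in graduation rates inside the same institution over time.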
Room: Post Oak
Rion McDonald University of North Texas
Marisol Benitez-Ramirez Pellissippi State Community College
Facilitator: Belinda Brewster-Clemence Forsyth Technical Community College
Institutional Research / Paper
Are we measuring what we say we are measuring? The use of exploratory factor analysis in scale development.
The use of questionnaires to gather data from students across college campuses has long been a mainstay in higher education. Oftentimes these measures are developed to assess a variety of educational and psychological constructs that are latent in nature. When items are designed to measure constructs that are not directly observable, it is important to ensure that the items are indeed measuring the construct(s) of interest. Exploratory factor analysis allows for the determination of which constructs are being measured, as well as the nature of those constructs, and should be included as an important step in scale development. This is especially important since information gathered from students is often used to make predictions. An example of an exploratory factor analysis will be highlighted.
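As a minimal sketch of the kind of analysis described above, the following extracts first-factor loadings from simulated survey items via eigen-decomposition of the correlation matrix (a simplified, non-iterated principal-axis style extraction); the item count, sample size, and loadings are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical survey: 6 items, the first 3 driven strongly by one latent
# construct, the last 3 only weakly (illustrative assumptions).
n = 1000
factor = rng.normal(size=n)
true_loadings = np.array([0.8, 0.7, 0.75, 0.2, 0.15, 0.1])
items = np.outer(factor, true_loadings) + rng.normal(scale=0.6, size=(n, 6))

# Factor extraction from the item correlation matrix.
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]                   # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings on the first factor: items 1-3 should load strongly and
# items 4-6 weakly, recovering the intended structure.
loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])
```

Inspecting which items load heavily on a factor is how the analyst checks that the items "are indeed measuring the construct(s) of interest"; in practice a dedicated routine with rotation and iterated communalities would be used rather than this bare eigen-decomposition.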
Room: Live Oak I
Danielle D. Fearon-Drake Baylor University
Facilitator: Elaine Harper University of West Georgia
Institutional Research / Paper
Why Aren’t They Coming Back?—Factors Impacting Students’ Retention
As demand grows for increased retention and graduation rates, innovative approaches have become more evident on campuses nationwide. As part of a strategic initiative designed to obtain, investigate, and analyze feedback from non-returning students, we deployed an online survey. The data obtained were insightful and informative. Similar to past research (Kraska, 2008), the current study found that four main areas served as barriers to returning to university studies: job and financial reasons, university-related reasons, personal and family reasons, and academic reasons.
Room: West Fork II
Dan Su Texas A&M University Commerce
Shonda Gibson Texas A&M University Commerce
Natalia Assis Texas A&M University Commerce
Facilitator: Jana Marak Baylor University
Institutional Research /
IR from a Newcomer's Perspective
In a position that relies so heavily on data accuracy and timely deadlines, becoming efficient and effective is essential. This talk will examine the practices that helped a newcomer find direction and meet the objectives of an Institutional Research Analyst.
Room: Red Oak
Nareiko Stephens Jefferson State Community College
Facilitator: Wendy Broyles Troy University
Planning / Work Share
Project Management and Organizational Theory for Institutional Researchers
Pressures on institutional researchers to focus on reporting requirements leave little time to pause and think strategically about long-term goals or to build meaningful relationships with key stakeholders. This session introduces IR professionals to concepts in project management that lead to quick wins and efficiency gains. Case studies from Emory Law’s student lifecycle model and Georgia Tech’s Online Master of Science in Computer Science are used to demonstrate how project management tools produce results. Topics include defining scope and drafting a charter, developing workplans using Gantt charts, organizing data and personnel with process maps, and skills for presenting data as information.
Room: Bur Oak
Justin C. Shepherd Emory University School of Law
Jillian Morn Georgia Institute of Technology College of Computing
Facilitator: Patricia White Belmont University
Sponsor Session /
The "New School" Method for Unlocking Graduate Outcomes Insights
If you’re not able to provide your stakeholders with graduate data that’s real-time, longitudinal, and free from self-reporting falsehoods, consider your methods "old school". Today, IR professionals have “new school” options available to more effectively gain insight on graduate outcomes and better support on-campus demands for data. In this presentation by Equifax, attendees will learn how to revolutionize their data collection processes used for key research areas such as first destination, mid-career income and industry, and more. Best practices and mini case study examples from current Equifax Graduate Outcomes projects will also be shared during this session.
Room: West Fork I
Vince Jajuga Equifax
Meghan Solomon Equifax
Facilitator: Lisa Lord University of Louisiana at Lafayette
Technology / Work Share
Wish We Knew Then What We Know Now: Lessons Learned from a Tableau Implementation
Visual analytics tools are valuable assets in IR. Implementing these tools requires careful planning and execution with existing data structures. In this session, we discuss the challenges of integrating Tableau with our Oracle-based data warehouse. We cover best practices for data connections, manipulating and extracting data for the Tableau environment, building efficient dashboards, and publishing to Tableau Server and institutional webpages. We also discuss how to reconcile consumer demands with functional limitations in Tableau. Although this session is specific to Tableau, many of the lessons we learned can be generalized to the implementation of any visual analytics tool.
Room: Elm Fork I
Susan Moreno University of Houston
Carmen Allen University of Houston
Vyas Krishnamurthy University of Houston
Jorge Martinez University of Houston
Facilitator: Sandi Bramblett Georgia Institute of Technology