The University of Queensland

Research Computing Centre


Analysing Big Data using Workflows: from fighting wildfires to helping patients

23 September 2016
9:00am to 10:00am
Room 505A, Axon Building 47, The University of Queensland (St Lucia)

Speaker:

Dr Ilkay Altintas, Chief Data Science Officer, San Diego Supercomputer Center, University of California San Diego

Abstract:

A growing number of applications require the processing of streaming data, often described as Big Data because of the volume, velocity and/or variety of the data to be processed. Such applications are driving an ever-greater need for dynamic capabilities in computing.

Over the last decade, scientific workflows and dataflow systems have emerged as a successful model for big data processing, especially in scenarios where a scalable and reusable integration of streaming data, analytical tools and computational infrastructure is needed.

Emerging heterogeneous computing architectures and cloud technologies are enabling workflows to be utilised as a scalable and reproducible programming model for data streaming and steering within dynamic data-driven applications.

This talk will summarise the varying and changing scalability requirements of distributed workflows influenced by Big Data and heterogeneous computing architectures, including our ongoing research on end-to-end performance prediction and scheduling for workflow-driven applications.
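To make the dataflow idea behind workflow systems such as Kepler concrete, here is a minimal illustrative sketch in Python: independent "actors" consume and produce streams of records, and the workflow is composed by wiring actor outputs to inputs. The actor names (`read_sensor`, `smooth`, `detect_anomaly`) and the data are hypothetical examples for this sketch, not part of Kepler or any workflow system's API.

```python
# Hypothetical dataflow sketch: three actors wired into a streaming pipeline.
# This illustrates the actor-oriented model only; it is not Kepler code.

def read_sensor():
    """Source actor: emit a stream of (timestamp, value) records."""
    for t, v in enumerate([10.1, 10.3, 42.0, 10.2]):  # made-up sample data
        yield (t, v)

def smooth(stream, window=2):
    """Transform actor: running mean over the last `window` values."""
    buf = []
    for t, v in stream:
        buf.append(v)
        buf = buf[-window:]
        yield (t, sum(buf) / len(buf))

def detect_anomaly(stream, threshold=20.0):
    """Sink actor: keep records whose smoothed value exceeds the threshold."""
    return [(t, v) for t, v in stream if v > threshold]

# Compose the workflow by connecting actor outputs to inputs.
alerts = detect_anomaly(smooth(read_sensor()))
print(alerts)
```

Because each actor only sees a stream, the same pipeline can be re-wired or scaled without changing the actors themselves, which is the reusability property the abstract refers to.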


Biography:

Dr Ilkay Altintas is the Chief Data Science Officer at the San Diego Supercomputer Center (SDSC), University of California San Diego, where she is also the Founder and Director of the Workflows for Data Science Center of Excellence.

Since joining SDSC in 2001, she has worked on different aspects of scientific workflows as a principal investigator and in other leadership roles across a wide range of cross-disciplinary projects.

She is a co-initiator of and an active contributor to the open-source Kepler Scientific Workflow System, and the co-author of publications related to computational data science at the intersection of scientific workflows, provenance, distributed computing and big data with applications to many scientific domains.

© The University of Queensland
Updated: 27 Sep 2016