American Psychological Association (APA): An advanced generative AI platform for faster research

Category: Healthcare, Education

Services: Gen AI Development, Architecture Design and Review, Managed Engineering Teams

  • 80% increase in knowledge discovery speed.
  • 70% improvement in AI model accuracy.
  • 80% reduction in time to access mental health support.

About APA

American Psychological Association (APA) is the leading scientific and professional organization representing psychology in the United States, with more than 157,000 researchers, educators, clinicians, consultants, and students as its members. The organization needed a generative AI platform that would let its researchers search for resources across multiple sources. The platform had to be accurate and support multiple languages, enabling researchers to find the information they need in the APA knowledge base.

Challenge

  • APA needed to build cognitive search into its web portal so researchers could discover knowledge seamlessly.
  • APA needed a generative AI platform that enabled contextual conversations and follow-up questions for researchers across the research community.
  • The platform also had to accept researcher input in different languages while returning contextually accurate results.
  • APA needed a solution that reduced the time researchers spent searching for data across its databases.
  • Building a new AI model and training it on both existing and new data was challenging for APA.
  • The organization also needed to deploy AI models and integrate them into its existing applications.

Solution

  • Simform leveraged its AWS expertise to build generative AI capabilities for APA with Amazon Bedrock and Amazon SageMaker.
  • First, we helped APA select a suitable LLM and train it using responses fetched from various APA data sources.
  • The fetched data was stored in Amazon S3 to create a knowledge base of XML article files.
  • Further, we leveraged Amazon SageMaker to deploy Llama LLMs and AWS Fargate to scale the application on a containerized platform.
  • Amazon ECS helped our experts manage and orchestrate the containerized workloads.
  • We used Amazon CloudWatch to improve visibility into application performance and enhance platform monitoring.
  • Our team also leveraged Amazon OpenSearch Service as a vector database and stored user feedback in it, which helped train the AI model.
  • We also embedded the APA knowledge base into the platform using Amazon Titan Multimodal Embeddings (see the query-flow sketch after this list).
  • To deploy the AI model effectively, we leveraged AWS CodePipeline to automate the build and deployment process, relying on AWS CodeBuild for compiling code, running tests, and creating release packages.
  • We also provided continuous improvement for the generative AI platform through fully managed CI/CD using AWS Amplify.
  • Once deployed, the application was protected from online threats with AWS WAF.
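
The case study does not include implementation details, but the query-time flow described above can be sketched roughly as follows: embed the researcher's question with Titan Multimodal Embeddings via Bedrock, retrieve similar passages from the OpenSearch vector index, and pass the retrieved context to the Llama model hosted on SageMaker. This is a minimal illustration only; the endpoint name, index name, OpenSearch host, and field names are hypothetical, and the response format of the Llama endpoint depends on the serving container.

```python
import json

import boto3
from opensearchpy import OpenSearch

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
sm_runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

# Hypothetical OpenSearch domain; authentication is omitted for brevity.
os_client = OpenSearch(
    hosts=[{"host": "search-apa-kb.example.com", "port": 443}],
    use_ssl=True,
)


def embed_query(text: str) -> list[float]:
    """Embed a researcher query with Amazon Titan Multimodal Embeddings."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-image-v1",  # Titan Multimodal Embeddings G1
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]


def retrieve_context(query: str, k: int = 5) -> list[str]:
    """k-NN search over article embeddings in the (hypothetical) apa-articles index."""
    vector = embed_query(query)
    result = os_client.search(
        index="apa-articles",
        body={"size": k, "query": {"knn": {"embedding": {"vector": vector, "k": k}}}},
    )
    return [hit["_source"]["text"] for hit in result["hits"]["hits"]]


def answer(query: str) -> str:
    """Build a grounded prompt and call the Llama model on a SageMaker endpoint."""
    context = "\n\n".join(retrieve_context(query))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    response = sm_runtime.invoke_endpoint(
        EndpointName="apa-llama-endpoint",  # hypothetical endpoint name
        ContentType="application/json",
        Body=json.dumps({"inputs": prompt, "parameters": {"max_new_tokens": 512}}),
    )
    # Text-generation containers commonly return a list of generations.
    return json.loads(response["Body"].read())[0]["generated_text"]
```

In this shape, multilingual input and follow-up questions are handled by the LLM itself, while the OpenSearch index keeps answers grounded in APA's own articles.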

Outcome

  • Amazon Bedrock and SageMaker helped build advanced generative AI capabilities, enabling contextual conversations that improved knowledge discovery speed by 80%.
  • We used Amazon SageMaker and AWS Fargate to improve AI model accuracy by 70%.
  • Using AWS WAF helped improve the security and data quality of the generative AI platform by 65%.
  • Achieved a 35% improvement in the contextual relevance of the generative AI-based search system by leveraging Amazon Titan Multimodal Embeddings.

Architecture Diagram

APA - diagram

AWS Services

  • Amazon Bedrock LLM models - We leveraged Amazon Bedrock capabilities with few-shot learning and prompt engineering to adapt models to data fetched from different sources.
  • Amazon S3 - The data fetched from different sources was stored in Amazon S3 to create a knowledge base of XML article files.
  • Amazon SageMaker - We leveraged Amazon SageMaker to deploy Llama models.
  • AWS Fargate - Our team leveraged AWS Fargate to scale the application on a containerized platform.
  • Amazon ECS - We used Amazon ECS to manage and orchestrate containerized workloads.
  • Amazon CloudWatch - We used Amazon CloudWatch to improve application performance visibility and enhance platform monitoring.
  • Amazon OpenSearch Service - Our team used Amazon OpenSearch Service as a vector database to store user feedback and help train the AI model.
  • Amazon Titan Multimodal Embeddings - We embedded the APA knowledge base in the platform using Amazon Titan Multimodal Embeddings (see the indexing sketch after this list).
  • AWS CodePipeline - We automated the build and deployment process using AWS CodePipeline.
  • AWS CodeBuild - Our team used AWS CodeBuild for code compilation, running tests, and creating release packages.
  • AWS Amplify - We enabled continuous improvement of the generative AI platform using AWS Amplify to manage CI/CD.
  • AWS WAF - We used AWS WAF to secure the platform from online threats and cyberattacks.
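
To make the embedding and vector-database items above concrete, here is a hypothetical sketch of the indexing side: reading XML articles from the S3 knowledge base, embedding them with Titan Multimodal Embeddings through Bedrock, and bulk-loading the vectors into OpenSearch. The bucket, host, and index names are illustrative, and the index is assumed to already exist with a knn_vector mapping on the embedding field.

```python
import json
import xml.etree.ElementTree as ET

import boto3
from opensearchpy import OpenSearch, helpers

s3 = boto3.client("s3")
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
os_client = OpenSearch(
    hosts=[{"host": "search-apa-kb.example.com", "port": 443}],  # hypothetical domain
    use_ssl=True,
)

BUCKET = "apa-knowledge-base"  # hypothetical bucket holding the XML article files
INDEX = "apa-articles"         # hypothetical index with a knn_vector "embedding" field


def article_text(key: str) -> str:
    """Fetch one XML article from S3 and flatten its element text."""
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    root = ET.fromstring(body)
    return " ".join(piece.strip() for piece in root.itertext() if piece.strip())


def embed(text: str) -> list[float]:
    """Embed article text with Titan Multimodal Embeddings (truncated for this demo)."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-image-v1",
        body=json.dumps({"inputText": text[:8000]}),
    )
    return json.loads(response["body"].read())["embedding"]


def index_articles(prefix: str = "articles/") -> None:
    """Embed every XML article under the prefix and bulk-index it into OpenSearch."""
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix)
    actions = []
    for obj in listing.get("Contents", []):
        text = article_text(obj["Key"])
        actions.append(
            {"_index": INDEX, "_id": obj["Key"],
             "_source": {"text": text, "embedding": embed(text)}}
        )
    helpers.bulk(os_client, actions)
```

User feedback could be stored in a separate index on the same cluster, matching the feedback loop described in the solution above.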


Speak to our experts to unlock the value of Cloud!