Pinned: About Me — Ram Vegiraju. My Top Medium Stories — Subscribe Here & Join My Newsletter. Apr 21, 2022
Published in AWS in Plain English: Document Summarization Simplified Using Claude 3.5 Sonnet. Tackling Popular LLM Use-Cases Utilizing Amazon Bedrock. Sep 17, 2024
Published in AWS in Plain English: Simplifying Triton Inference Server Configuration Setup. Enabling Triton Inference Server Auto-Complete-Config. May 20, 2024
Published in TDS Archive: Using Generative AI To Curate Date Recommendations. Utilizing Amazon Bedrock, Google Places, LangChain, and Streamlit. Mar 21, 2024
Published in AWS in Plain English: Deploying Transformers ONNX Models on Amazon SageMaker. Achieve High Scale Performance Utilizing Triton Inference Server With SageMaker Real-Time Inference. Mar 13, 2024
Published in AWS in Plain English: Bring Your Own LLM Evaluation Algorithms to SageMaker Clarify Foundation Model Evaluations. Extend the FMEval library to incorporate your own evaluations into MLOps workflows. Mar 12, 2024
Published in AWS in Plain English: Image To Text With Claude 3 Sonnet. Exploring The New Claude Model On Amazon Bedrock. Mar 5, 2024
Published in TDS Archive: Generate Music Recommendations Utilizing LangChain Agents. Powered by Bedrock Claude and the Spotify API. Mar 5, 2024
Published in TDS Archive: Optimized Deployment of Mistral7B on Amazon SageMaker Real-Time Inference. Utilize large model inference containers powered by DJL Serving & Nvidia TensorRT. Feb 21, 2024