Maybe/test/models/provider/openai_test.rb

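# Tests for the OpenAI LLM provider. Every example wraps its API interaction in a
# VCR cassette, so the suite normally replays recorded responses rather than
# hitting the live OpenAI API.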
require "test_helper"
class Provider::OpenaiTest < ActiveSupport::TestCase
include LLMInterfaceTest
setup do
@subject = @openai = Provider::Openai.new(ENV.fetch("OPENAI_ACCESS_TOKEN", "test-openai-token"))
@subject_model = "gpt-4o"
@chat = chats(:two)
end
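
  # An invalid model name makes the API return an error; the provider should
  # surface it as an unsuccessful response wrapping a Provider::Openai::Error
  # instead of letting the exception escape to the caller.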
test "openai errors are automatically raised" do
VCR.use_cassette("openai/chat/error") do
response = @openai.chat_response(UserMessage.new(
chat: @chat,
content: "Error test",
ai_model: "invalid-model-that-will-trigger-api-error"
))
assert_not response.success?
assert_kind_of Provider::Openai::Error, response.error
end
end
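
  # Happy path: a single user message produces one assistant message whose
  # content contains the "Yes" requested in the prompt.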
test "basic chat response" do
VCR.use_cassette("openai/chat/basic_response") do
message = @chat.messages.create!(
type: "UserMessage",
content: "This is a chat test. If it's working, respond with a single word: Yes",
ai_model: @subject_model
)
response = @subject.chat_response(message)
assert response.success?
assert_equal 1, response.data.messages.size
assert_includes response.data.messages.first.content, "Yes"
end
end
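
  # Same conversation as above, consumed through the streaming interface: the
  # streamer should receive one "output_text" chunk and one final "response"
  # chunk, and no "function_request" chunks for a plain question.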
test "streams basic chat response" do
VCR.use_cassette("openai/chat/basic_response") do
collected_chunks = []
mock_streamer = proc do |chunk|
collected_chunks << chunk
end
message = @chat.messages.create!(
type: "UserMessage",
content: "This is a chat test. If it's working, respond with a single word: Yes",
ai_model: @subject_model
)
@subject.chat_response(message, streamer: mock_streamer)
tool_call_chunks = collected_chunks.select { |chunk| chunk.type == "function_request" }
text_chunks = collected_chunks.select { |chunk| chunk.type == "output_text" }
response_chunks = collected_chunks.select { |chunk| chunk.type == "response" }
assert_equal 1, text_chunks.size
assert_equal 1, response_chunks.size
assert_equal 0, tool_call_chunks.size
assert_equal "Yes", text_chunks.first.data
assert_equal "Yes", response_chunks.first.data.messages.first.content
end
end
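
  # The provider should execute the supplied function (a stubbed net worth
  # lookup) and fold its result into the final assistant message.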
test "chat response with tool calls" do
VCR.use_cassette("openai/chat/tool_calls") do
response = @subject.chat_response(
tool_call_message,
instructions: "Use the tools available to you to answer the user's question.",
available_functions: [ PredictableToolFunction.new(@chat) ]
)
assert response.success?
assert_equal 1, response.data.functions.size
assert_equal 1, response.data.messages.size
assert_includes response.data.messages.first.content, PredictableToolFunction.expected_test_result
end
end
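
  # Streaming variant of the tool-call test: exactly one "function_request"
  # chunk, at least one "output_text" chunk, and a final "response" chunk whose
  # message includes the stubbed function's result.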
test "streams chat response with tool calls" do
VCR.use_cassette("openai/chat/tool_calls") do
collected_chunks = []
mock_streamer = proc do |chunk|
collected_chunks << chunk
end
@subject.chat_response(
tool_call_message,
instructions: "Use the tools available to you to answer the user's question.",
available_functions: [ PredictableToolFunction.new(@chat) ],
streamer: mock_streamer
)
text_chunks = collected_chunks.select { |chunk| chunk.type == "output_text" }
text_chunks = collected_chunks.select { |chunk| chunk.type == "output_text" }
tool_call_chunks = collected_chunks.select { |chunk| chunk.type == "function_request" }
response_chunks = collected_chunks.select { |chunk| chunk.type == "response" }
assert_equal 1, tool_call_chunks.count
assert text_chunks.count >= 1
assert_equal 1, response_chunks.count
assert_includes response_chunks.first.data.messages.first.content, PredictableToolFunction.expected_test_result
end
end

  private

    def tool_call_message
      UserMessage.new(chat: @chat, content: "What is my net worth?", ai_model: @subject_model)
    end
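
    # A predictable Assistant::Function stub: it always returns a known value, so
    # the tool-call tests can assert that the provider feeds function results
    # back into the final assistant message.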
    class PredictableToolFunction < Assistant::Function
      class << self
        def expected_test_result
          "$124,200"
        end

        def name
          "get_net_worth"
        end

        def description
          "Gets user net worth data"
        end
      end

      def call(params = {})
        self.class.expected_test_result
      end
    end
end