Building an AI-Powered “Rizz” App: A Deep Dive with Flutter, Go, and OpenAI

The intersection of AI and personal interaction presents fascinating opportunities. Today, we’ll explore building a prototype “Rizz App” – a mobile application designed to provide text-based dating advice. This tutorial focuses on the core technical architecture, demonstrating a practical integration of Flutter, Go, and OpenAI’s API. While this is a simplified example, the principles we cover are foundational for more complex AI-driven applications. Note that this prototype is not production-ready and lacks many features a consumer-facing app would need.

Why This Tech Stack?

  • Flutter: Excellent for rapid UI development and cross-platform deployment (iOS and Android) from a single codebase. Its declarative UI paradigm and hot-reload features enhance developer productivity.
  • Go: Known for its efficiency, concurrency support, and strong standard library. Go is well-suited for building scalable and performant backend services.
  • OpenAI API: Provides access to state-of-the-art language models, enabling sophisticated text analysis and generation without requiring us to train our own models from scratch.
  • In-Memory Data Store: For this prototype, we’ll use an in-memory data store in Go to keep things simple. In a production setting, you’d replace this with a persistent database (e.g., PostgreSQL, MongoDB).
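
To make that swap concrete, here is a minimal sketch of what an in-memory store could look like in Go. The ConversationStore type and its methods are illustrative only; the data layer we actually build later in this tutorial stays a thin OpenAI wrapper and does not persist anything.

Go

package data

import "sync"

// ConversationStore is a hypothetical mutex-protected, in-memory store of
// message history keyed by user ID. A production app would replace this
// with a persistent database such as PostgreSQL or MongoDB.
type ConversationStore struct {
    mu      sync.RWMutex
    history map[string][]string
}

// NewConversationStore creates an empty store.
func NewConversationStore() *ConversationStore {
    return &ConversationStore{history: make(map[string][]string)}
}

// Append records a message for a user.
func (s *ConversationStore) Append(userID, message string) {
    s.mu.Lock()
    defer s.mu.Unlock()
    s.history[userID] = append(s.history[userID], message)
}

// History returns a copy of the stored messages for a user.
func (s *ConversationStore) History(userID string) []string {
    s.mu.RLock()
    defer s.mu.RUnlock()
    return append([]string(nil), s.history[userID]...)
}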

Prerequisites

  • Flutter SDK: Installed and configured.
  • Go: Installed and configured.
  • OpenAI API Key: Obtain one from the OpenAI platform (requires an account and billing setup).
  • Basic understanding of REST APIs.
  • Familiarity with Dart and Go syntax.

Project Structure

A well-organized project is crucial for maintainability. Here’s the recommended structure:

rizz_app/
├── lib/            # Flutter Frontend
│   ├── main.dart          # App Entry Point
│   ├── screens/           # UI Screens
│   │   └── home_screen.dart # Main screen with input and advice
│   └── services/        # Abstraction for API communication
│       └── api_service.dart # Handles requests to the Go backend
├── backend/          # Go Backend
│   ├── main.go          # Server entry point and routing
│   └── data/            # Data layer (in-memory for this example)
│        └── datastore.go  # OpenAI interaction and (placeholder for) data storage
├── pubspec.yaml       # Flutter Dependencies
└── README.md

Step 1: The Flutter Frontend (UI and API Interaction)

We’ll start by setting up the Flutter UI.

1.1 Dependencies (pubspec.yaml):

YAML

dependencies:
  flutter:
    sdk: flutter
  http: ^1.1.0      # For making HTTP requests
  image_picker: ^1.0.4   # For image selection
  image_cropper: ^5.0.0  # For image cropping
  cupertino_icons: ^1.0.2
  path_provider: ^2.1.1

dev_dependencies:
  flutter_test:
    sdk: flutter
  flutter_lints: ^2.0.0

Run flutter pub get to install these.

1.2 Main Entry Point (lib/main.dart):

Dart

import 'package:flutter/material.dart';
import 'package:rizz_app/screens/home_screen.dart';

void main() {
  runApp(const RizzApp());
}

class RizzApp extends StatelessWidget {
  const RizzApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Rizz App',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: const HomeScreen(),
    );
  }
}

1.3 Home Screen (lib/screens/home_screen.dart):

This screen handles user input (text and optional image) and displays the AI’s advice. We use a StatefulWidget to manage the UI state.

Dart

import 'dart:io';
import 'package:flutter/material.dart';
import 'package:image_picker/image_picker.dart';
import 'package:image_cropper/image_cropper.dart';
import 'package:path_provider/path_provider.dart';
import 'package:rizz_app/services/api_service.dart';

class HomeScreen extends StatefulWidget {
  const HomeScreen({super.key});

  @override
  State<HomeScreen> createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  final _textController = TextEditingController();
  String _advice = '';
  bool _isLoading = false;
  File? _imageFile;

  Future<void> _pickImage(ImageSource source) async {
    final pickedFile = await ImagePicker().pickImage(source: source);
    if (pickedFile != null) {
      _cropImage(File(pickedFile.path));
    }
  }

  Future<void> _cropImage(File imageFile) async {
    final directory = await getApplicationDocumentsDirectory();
    final name = DateTime.now().millisecondsSinceEpoch.toString();
    final path = '${directory.path}/$name.jpg';
    CroppedFile? croppedFile = await ImageCropper().cropImage(
      sourcePath: imageFile.path,
      aspectRatioPresets: [
        CropAspectRatioPreset.square,
        CropAspectRatioPreset.ratio3x2,
        CropAspectRatioPreset.original,
        CropAspectRatioPreset.ratio4x3,
        CropAspectRatioPreset.ratio16x9
      ],
      uiSettings: [
        AndroidUiSettings(
            toolbarTitle: 'Cropper',
            toolbarColor: Colors.deepOrange,
            toolbarWidgetColor: Colors.white,
            initAspectRatio: CropAspectRatioPreset.original,
            lockAspectRatio: false),
        IOSUiSettings(
          title: 'Cropper',
        ),
      ],
    );
    if (croppedFile != null) {
      // Persist the cropped image to the app's documents directory and use
      // that copy for display and upload.
      final savedImage = await File(croppedFile.path).copy(path);
      setState(() {
        _imageFile = savedImage;
      });
    }
  }

  Future<void> _getAdvice() async {
    if (_textController.text.isEmpty && _imageFile == null) {
      //Simple validation
      return;
    }
    setState(() {
      _isLoading = true;
      _advice = '';
    });

    final textInput = _textController.text;
    final String? imagePath = _imageFile?.path;

    try {
      final advice = await ApiService.getAdvice(textInput, imagePath);
      setState(() {
        _advice = advice;
      });
    } catch (e) {
      setState(() {
        _advice = 'Error: Could not get advice. $e';
      });
    } finally {
      setState(() {
        _isLoading = false;
      });
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Rizz Advisor'),
      ),
      body: Padding(
        padding: const EdgeInsets.all(16.0),
        child: Column(
          crossAxisAlignment: CrossAxisAlignment.stretch,
          children: [
            TextField(
              controller: _textController,
              decoration: const InputDecoration(
                labelText: 'Enter text message(s)',
                border: OutlineInputBorder(),
              ),
              maxLines: 3,
            ),
            const SizedBox(height: 10),
            Row(
              mainAxisAlignment: MainAxisAlignment.spaceEvenly,
              children: <Widget>[
                ElevatedButton.icon(
                  onPressed: () => _pickImage(ImageSource.camera),
                  icon: const Icon(Icons.camera_alt),
                  label: const Text('Camera'),
                ),
                ElevatedButton.icon(
                  onPressed: () => _pickImage(ImageSource.gallery),
                  icon: const Icon(Icons.photo_library),
                  label: const Text('Gallery'),
                ),
              ],
            ),
            if (_imageFile != null) ...[
              const SizedBox(height: 10),
              Image.file(_imageFile!), // Display the selected image.
            ],
            const SizedBox(height: 20),
            ElevatedButton(
              onPressed: _isLoading ? null : _getAdvice,
              child: _isLoading
                  ? const CircularProgressIndicator()
                  : const Text('Get Advice'),
            ),
            const SizedBox(height: 20),
            Expanded(
              child: SingleChildScrollView(
                child: Text(
                  _advice,
                  style: const TextStyle(fontSize: 16),
                ),
              ),
            ),
          ],
        ),
      ),
    );
  }

  @override
  void dispose() {
    _textController.dispose();
    super.dispose();
  }
}

1.4 API Service (lib/services/api_service.dart):

This service encapsulates the communication with the Go backend. This separation of concerns makes the code more modular and testable.

Dart

import 'dart:convert';
import 'package:http/http.dart' as http;

class ApiService {
  // Go backend URL. localhost works when the app runs on the same machine as
  // the server; on the Android emulator the host is typically reachable at
  // 10.0.2.2 instead.
  static const String _baseUrl = 'http://localhost:8080';

  static Future<String> getAdvice(String textInput, String? imagePath) async {
    final url = Uri.parse('$_baseUrl/advice');
    final Map<String, String> headers = {
      'Content-Type': 'application/json',
    };
    // Prepare the body, including the image path if available
    final body = jsonEncode({
      'text': textInput,
      'imagePath': imagePath, // This will be null if no image is selected
    });

    final response = await http.post(url, headers: headers, body: body);

    if (response.statusCode == 200) {
      final Map<String, dynamic> data = json.decode(response.body);
      return data['advice'];
    } else {
      throw Exception('Failed to get advice: ${response.statusCode}');
    }
  }
}

Step 2: The Go Backend (API and OpenAI Integration)

The Go backend is responsible for receiving requests from the Flutter app, interacting with the OpenAI API, and returning advice.

2.1 Main Server (backend/main.go):

Go

package main

import (
    "encoding/json"
    "fmt"
    "io"
    "log"
    "net/http"
    "os"
    "strings"

    "rizz_app/data"

    "github.com/joho/godotenv"
)

// RequestData defines the structure of the incoming JSON request.
type RequestData struct {
    Text      string `json:"text"`
    ImagePath string `json:"imagePath"` // Path to the image (if provided)
}

// ResponseData defines the structure of the JSON response.
type ResponseData struct {
    Advice string `json:"advice"`
}

// adviceHandler handles requests to the /advice endpoint.
func adviceHandler(w http.ResponseWriter, r *http.Request) {
    // Enable CORS for local development.  Remove or configure properly for production.
    w.Header().Set("Access-Control-Allow-Origin", "*")
    w.Header().Set("Access-Control-Allow-Headers", "Content-Type")
    // Handle preflight requests for CORS.
    if r.Method == "OPTIONS" {
        w.WriteHeader(http.StatusOK)
        return
    }

    // Decode the JSON request body.
    var requestData RequestData
    decoder := json.NewDecoder(r.Body)
    if err := decoder.Decode(&requestData); err != nil {
        http.Error(w, "Invalid request body", http.StatusBadRequest)
        return
    }

    // Process the image if a path is provided.
    var combinedInput string
    if requestData.ImagePath != "" {
        //Correct the Image Path
        imagePath := strings.ReplaceAll(requestData.ImagePath, "\\", "/")

        file, err := os.Open(imagePath)
        if err != nil {
            http.Error(w, "Failed to open image file", http.StatusInternalServerError)
            return
        }
        defer file.Close()
        imageBytes, err := io.ReadAll(file)
        if err != nil {
            http.Error(w, "Failed to read the image", http.StatusInternalServerError)
            return
        }

        // The raw bytes are not sent to the model here; only their length is
        // folded into the prompt as a placeholder. Real image analysis would
        // require base64-encoding the bytes for a vision-capable model, e.g.:
        // base64Image := base64.StdEncoding.EncodeToString(imageBytes)

        // Combine a summary of the image with the text input.
        combinedInput = fmt.Sprintf("Image Data Length: %d, Text: %s", len(imageBytes), requestData.Text)
    } else {
        combinedInput = requestData.Text
    }

    // Get advice from the OpenAI API (using our data layer).
    advice, err := data.GetOpenAIResponse(combinedInput)
    if err != nil {
        http.Error(w, "Failed to get AI advice", http.StatusInternalServerError)
        return
    }

    // Construct the response and send it back as JSON.
    responseData := ResponseData{Advice: advice}
    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(responseData)
}

func main() {
    // Load environment variables from a .env file.
    err := godotenv.Load()
    if err != nil {
        log.Fatal("Error loading .env file")
    }
    // Check if the required environment variable is set
    openAIKey := os.Getenv("OPENAI_API_KEY")
    if openAIKey == "" {
        log.Fatal("OPENAI_API_KEY environment variable not set")
    }
    // Initialize the data store with the OpenAI API key
    data.InitDataStore(openAIKey)

    // Set up the HTTP route handler.
    http.HandleFunc("/advice", adviceHandler)

    // Start the server.
    fmt.Println("Server listening on port 8080")
    log.Fatal(http.ListenAndServe(":8080", nil))
}
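
Before wiring up the frontend, you can sanity-check the handler with Go’s httptest package. The test below is a small sketch, assuming a hypothetical backend/main_test.go file in the same package; it only exercises the CORS preflight branch, so it runs without an OpenAI key.

Go

package main

import (
    "net/http"
    "net/http/httptest"
    "testing"
)

// TestAdviceHandlerPreflight exercises the OPTIONS branch of adviceHandler,
// which returns before any OpenAI call is made.
func TestAdviceHandlerPreflight(t *testing.T) {
    req := httptest.NewRequest(http.MethodOptions, "/advice", nil)
    rec := httptest.NewRecorder()

    adviceHandler(rec, req)

    if rec.Code != http.StatusOK {
        t.Fatalf("expected status 200, got %d", rec.Code)
    }
    if rec.Header().Get("Access-Control-Allow-Origin") != "*" {
        t.Errorf("expected the CORS header to be set")
    }
}

Run it with go test ./... from the backend directory.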

2.2 Data Layer (backend/data/datastore.go):

This module handles the interaction with the OpenAI API. It’s designed to be easily replaceable with a different data source or AI model.

Go

package data

import (
    "context"
    "fmt"
    "log"
    "strings"

    "github.com/sashabaranov/go-openai"
)

var client *openai.Client

// InitDataStore initializes the data store with the OpenAI API key.
func InitDataStore(apiKey string) {
    client = openai.NewClient(apiKey)
}

// GetOpenAIResponse gets advice from the OpenAI API based on the input.
func GetOpenAIResponse(input string) (string, error) {
    // Return an error rather than crashing the server if InitDataStore was
    // never called.
    if client == nil {
        return "", fmt.Errorf("OpenAI client not initialized")
    }
    prompt := fmt.Sprintf("Give dating advice based on this context, make sure it is only a sentence or two: %s", input)
    resp, err := client.CreateChatCompletion(
        context.Background(),
        openai.ChatCompletionRequest{
            Model: openai.GPT3Dot5Turbo,
            Messages: []openai.ChatCompletionMessage{
                {
                    Role:    openai.ChatMessageRoleUser,
                    Content: prompt,
                },
            },
        },
    )

    if err != nil {
        fmt.Printf("ChatCompletion error: %v\n", err)
        return "", err
    }
    // Collapse newlines and trim surrounding whitespace so the advice reads
    // as a single line.
    advice := strings.TrimSpace(strings.ReplaceAll(resp.Choices[0].Message.Content, "\n", " "))

    return advice, nil
}
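
To make the “easily replaceable” claim concrete, the OpenAI call could sit behind a small interface so another model or a test double can be dropped in. This is a hypothetical refactor, not part of the code above; the AdviceProvider name is ours.

Go

package data

import "context"

// AdviceProvider abstracts whatever produces advice, so the OpenAI-backed
// implementation can be swapped for a different model or a mock in tests.
type AdviceProvider interface {
    Advise(ctx context.Context, input string) (string, error)
}

// openAIProvider adapts the existing GetOpenAIResponse function to the interface.
type openAIProvider struct{}

func (openAIProvider) Advise(_ context.Context, input string) (string, error) {
    return GetOpenAIResponse(input)
}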

2.3 Environment Variables (backend/.env):

Store your OpenAI API key securely in a .env file:

OPENAI_API_KEY=YOUR_ACTUAL_OPENAI_API_KEY

Step 3: Running the Application

  1. Start the Backend: In the backend directory, initialize the Go module if you haven’t already (go mod init rizz_app, then go mod tidy to fetch the dependencies), and run go run main.go.
  2. Start the Frontend: In the root project directory (rizz_app), run flutter run.

Key Architectural Considerations

  • Separation of Concerns: The code is divided into distinct modules (UI, API service, data layer) for better organization, testability, and maintainability.
  • Abstraction: The api_service.dart file provides an abstraction layer over the HTTP requests, making it easier to switch to a different backend implementation if needed.
  • Error Handling: Basic error handling is included, but a production application would require much more robust error handling and logging.
  • Asynchronous Operations: Flutter’s Future and async/await are used to handle asynchronous operations (network requests) without blocking the UI thread.
  • CORS: The backend includes basic CORS handling, which is necessary for the Flutter web app to communicate with the Go server running on a different port. Crucially, this is a development-only configuration: the wildcard Access-Control-Allow-Origin header should be replaced with an explicit allow-list of origins before production, as sketched below.
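
As a rough sketch of what “configured properly” could mean, the handler might echo the Origin header back only when it appears on an explicit allow-list. The origins below are placeholders, not real deployment values; this snippet would live alongside main.go in the same package.

Go

package main

import "net/http"

// allowedOrigins lists the frontends permitted to call this API.
// The entry below is a placeholder.
var allowedOrigins = map[string]bool{
    "https://rizz-app.example.com": true,
}

// setCORSHeaders replaces the wildcard CORS headers used during development,
// echoing the Origin back only when it is on the allow-list.
func setCORSHeaders(w http.ResponseWriter, r *http.Request) {
    origin := r.Header.Get("Origin")
    if allowedOrigins[origin] {
        w.Header().Set("Access-Control-Allow-Origin", origin)
        w.Header().Set("Vary", "Origin")
        w.Header().Set("Access-Control-Allow-Headers", "Content-Type")
    }
}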
