Blog

  • JNIgen: Simplify Native Integration in Flutter

    Prepare to embark on a journey through Flutter as we uncover a remarkable new feature: JNIgen. In this blog, we pull back the curtain on JNIgen’s transformative power, from simplifying intricate integration tasks to improving scalability, and serve as a guide along the path to a seamlessly integrated Flutter ecosystem.

    As Flutter continues to mesmerize developers with its constant evolution, each release unveiling a treasure trove of thrilling new features, the highly anticipated Google I/O 2023 was an extraordinary milestone. Amidst the excitement, a groundbreaking technique was unveiled: JNIgen, offering effortless access to native code like never before.

    Let this blog guide you towards a future where your Flutter projects transcend limitations and manifest into awe-inspiring creations.

    1. What is JNIgen?

    JNIgen, which stands for Java Native Interface generator, is an innovative tool that automates the generation of Dart bindings for Android APIs exposed through Java or Kotlin code. By utilizing these generated bindings, developers can invoke Android APIs with a syntax that closely resembles native code.

    With JNIgen, developers can seamlessly bridge the gap between Dart and the rich ecosystem of Android APIs. This empowers them to leverage the full spectrum of Android’s functionality, ranging from system-level operations to platform-specific features. By effortlessly integrating with Android APIs through JNIgen-generated bindings, developers can harness the power of native code and build robust applications with ease.

    1.1. Default approach: 

    In the current Flutter framework, we rely on Platform channels to establish a seamless communication channel between Dart code and native code. These channels serve as a bridge for exchanging messages and data.

    Typically, we have a Flutter app acting as the client, while the native code contains the desired methods to be executed. The Flutter app sends a message containing the method name to the native code, which then executes the requested method and sends the response back to the Flutter app.

    However, this approach requires the manual implementation of handlers on both the Dart and native code sides. It entails writing code to handle method calls and manage the exchange of responses. Additionally, developers need to carefully manage method names and channel names on both sides to ensure proper communication.
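    To make that manual bookkeeping concrete, the sketch below simulates the string-keyed dispatch in plain Java. It deliberately uses no Flutter APIs; the FakeChannel class and the method names are hypothetical stand-ins for a real MethodChannel setup:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// A plain-Java stand-in for a platform channel's method dispatcher.
// Real Flutter code would use MethodChannel and MethodCallHandler instead.
class FakeChannel {
    private final Map<String, Function<String, String>> handlers = new HashMap<>();

    // The native side must register every method name by hand...
    void register(String method, Function<String, String> handler) {
        handlers.put(method, handler);
    }

    // ...and the Dart side must send exactly the same string.
    String invoke(String method, String arg) {
        Function<String, String> handler = handlers.get(method);
        if (handler == null) {
            return "MissingPluginException: no handler for " + method;
        }
        return handler.apply(arg);
    }
}

class ChannelDemo {
    public static void main(String[] args) {
        FakeChannel channel = new FakeChannel();
        channel.register("getDeviceModel", arg -> "stub-model");

        System.out.println(channel.invoke("getDeviceModel", ""));
        // A single typo in the method name fails only at runtime:
        System.out.println(channel.invoke("getDeviceModle", ""));
    }
}
```

    Because nothing ties the two sides together at compile time, a renamed method or channel breaks only at runtime; JNIgen’s generated bindings remove this class of error.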

    1.2. Working principle of JNIgen: 

    Figure 01

    In JNIgen, our native code path is passed to the JNIgen generator, which initiates the generation of an intermediate layer of C code. This C code is followed by the necessary boilerplate in Dart, facilitating access to the C methods. All data binding and C files are automatically generated in the directory specified in the .yaml file, which we will explore shortly.

    Consequently, as a Flutter application, our interaction is solely focused on interfacing with the newly generated Dart code, eliminating the need for direct utilization of native code.

    1.3. Similar tools: 

    During the Google I/O 2023 event, JNIgen was introduced as a tool for native code integration. However, it is important to note that not all external libraries available on pub.dev are built using platform channels. Another tool, FFIgen, was introduced earlier at Google I/O 2021 and serves a similar purpose: where JNIgen generates an intermediate C layer plus Dart bindings for Java and Kotlin APIs, FFIgen generates Dart FFI bindings directly from C headers.

    While JNIgen primarily facilitates communication between Android native code and Dart code, FFIgen has become the preferred choice for binding C-based native code, including iOS APIs exposed through C and Objective-C headers, to Dart. Both tools turn native interfaces into generated bindings, enabling seamless interoperability within their respective platforms.

    2. Configuration

    Prior to proceeding with the code implementation, it is essential to set up and install the necessary tools.

    2.1. System setup: 

    2.1.1 Install MVN

    Windows

    • Download the Maven binary zip archive for Windows from the link here
    • After extracting the zip file, you will get a folder named “apache-maven-x.x.x”
    • Create a new folder named “ApacheMaven” in “C:\Program Files” and paste the extracted folder into it. [Your path will be “C:\Program Files\ApacheMaven\apache-maven-x.x.x”]
    • Add the following entries under “Environment Variables” → “User Variables”:
      M2 ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x\bin”
      M2_HOME ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x”
    • Add a new entry “%M2_HOME%\bin” to the “Path” variable

    Mac

    • Download the Maven binary tar.gz archive for Mac from the link here
    • Run the following command in the directory where you downloaded the *.tar.gz file:
    tar -xvf apache-maven-3.8.7-bin.tar.gz

    • Add the following entry to .zshrc or .bash_profile to set the Maven path:
    export PATH="$PATH:/Users/username/Downloads/apache-maven-x.x.x/bin"

    Or

    • You can use brew to install llvm (which also bundles clang-format, needed in the next step):
    brew install llvm

    • Brew will give you instruction like this for further setup
    ==> llvm
    To use the bundled libc++ please add the following LDFLAGS:
    LDFLAGS="-L/opt/homebrew/opt/llvm/lib/c++ -Wl,-rpath,/opt/homebrew/opt/llvm/lib/c++"
    
    llvm is keg-only, which means it was not symlinked into /opt/homebrew,
    because macOS already provides this software and installing another version in
    parallel can cause all kinds of trouble.
    
    If you need to have llvm first in your PATH, run:
    echo 'export PATH="/opt/homebrew/opt/llvm/bin:$PATH"' >> ~/.zshrc
    
    For compilers to find llvm you may need to set:
    export LDFLAGS="-L/opt/homebrew/opt/llvm/lib"
    export CPPFLAGS="-I/opt/homebrew/opt/llvm/include"

    2.1.2 Install Clang-Format

    Windows

    • Download the latest version of LLVM for Windows from the link here

    Mac

    • Run the following brew command: 
    brew install clang-format

    2.2. Flutter setup: 

    2.2.1 Get Dependencies

    Run the following commands in your Flutter project directory:

    flutter pub add jni

    flutter pub add jnigen

    2.2.2 Setup configuration file

    Figure 01 provides a visual representation of the .yaml file, which holds crucial configurations utilized by JNIgen. These configurations serve the purpose of identifying paths for native classes, as well as specifying the locations where JNIgen generates the resulting C and Dart files. Furthermore, the .yaml file allows for specifying Maven configurations, enabling the selection of specific third-party libraries that need to be downloaded to facilitate code generation.

    By leveraging the power of the .yaml file, developers gain control over the path identification process and ensure that the generated code is placed in the desired locations. Additionally, the ability to define Maven configurations grants flexibility in managing dependencies, allowing the seamless integration of required third-party libraries into the generated code. This comprehensive approach enables precise control and customization over the code generation process, enhancing the overall efficiency and effectiveness of the development workflow.

    Let’s explore the properties that we have utilized within the .yaml file (please refer to the example in section “3.2.2. Code implementation” for better understanding):

    • android_sdk_config: 

    When add_gradle_deps is set to “true,” a Gradle stub is executed when JNIgen is invoked, and the Android compile classpath is included in JNIgen’s classpath. However, to ensure that all dependencies are cached appropriately, a release build must have been performed beforehand.

    • output 

    As the name implies, the “output” section defines the configuration related to the generation of intermediate code. This section plays a crucial role in determining how the intermediate code will be generated and organized.

    •  c >> library_name && c >> path:
      Here we set the details for the C-based binding code.

    •  dart >> path && dart >> structure:
      Here we define the configuration for the Dart-based binding code.

    •  source_path:

    These are specific directories that are scanned during the process of locating the relevant source files.

    •  classes:

    By providing a comprehensive list of classes or packages, developers can effectively control the scope of the code generation process. This ensures that the binding code is generated only for the desired components, minimizing unnecessary code generation.

    By utilizing these properties within the .yaml file, developers can effectively control various aspects of the code generation process, including path identification, code organization, and dependency management. To get more in-depth information, please check out the official documentation here.

    2.3. Generate bindings files:

    Once this setup is complete, the final step for JNIgen is to obtain the jar file that will be scanned to generate the required bindings. To initiate the process of generating the Android APK, you can execute the following command:

    flutter build apk

    Run the following command in your terminal to generate code:

    dart run jnigen --config jnigen.yaml

    2.4. Android setup: 

    Add the address of CMakeLists.txt file in your android >> app >> build.gradle file’s buildTypes section:

    buildTypes {
            externalNativeBuild {
                cmake {
                    path <address of CMakeLists.txt>
                }
            }
        }

    With this configuration, we specify the path of the CMake file that was generated by JNIgen. This path declaration is crucial for identifying the location of the generated CMake file within the project structure.

    With the completion of the aforementioned steps, you are now ready to run your application and leverage all the native functions that have been integrated.

    3. Sample Project

    To gain hands-on experience and better understand JNIgen, let’s create a small project together. Follow the steps below to get started.

    Let’s start with:

    3.1. Packages & directories:

    3.1.1 Create a project using the following command:

    flutter create jnigen_integration_project

    3.1.2 Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    jni: ^0.5.0
    jnigen: ^0.5.0

    3.1.3 Go to the android >> app >> src >> main directory.

    3.1.4 Create directories inside main as shown below:

    Figure 02 

    3.2. Code Implementation:

    3.2.1 We will start with the Android code. Create two files, HardwareUtils.java and HardwareUtilsKotlin.kt, inside the utils directory.

     HardwareUtilsKotlin.kt

    package com.hardware.utils
    
    import android.os.Build
    
    class HardwareUtilsKotlin {
    
       fun getHardwareDetails(): Map<String, String>? {
           val hardwareDetails: MutableMap<String, String> = HashMap()
           hardwareDetails["Language"] = "Kotlin"
           hardwareDetails["Manufacture"] = Build.MANUFACTURER
           hardwareDetails["Model No."] = Build.MODEL
           hardwareDetails["Type"] = Build.TYPE
           hardwareDetails["User"] = Build.USER
           hardwareDetails["SDK"] = Build.VERSION.SDK
           hardwareDetails["Board"] = Build.BOARD
           hardwareDetails["Version Code"] = Build.VERSION.RELEASE
           return hardwareDetails
       }
    }

     HardwareUtils.java 

    package com.hardware.utils;
    
    
    import android.os.Build;
    
    
    import java.util.HashMap;
    import java.util.Map;
    
    
    public class HardwareUtils {
    
    
       public Map<String, String> getHardwareDetails() {
           Map<String, String> hardwareDetails = new HashMap<String, String>();
           hardwareDetails.put("Language", "JAVA");
           hardwareDetails.put("Manufacture", Build.MANUFACTURER);
           hardwareDetails.put("Model No.", Build.MODEL);
           hardwareDetails.put("Type", Build.TYPE);
           hardwareDetails.put("User", Build.USER);
           hardwareDetails.put("SDK", Build.VERSION.SDK);
           hardwareDetails.put("Board", Build.BOARD);
           hardwareDetails.put("Version Code", Build.VERSION.RELEASE);
           return hardwareDetails;
       }
    
    
       public Map<String, String> getHardwareDetailsKotlin() {
           return new HardwareUtilsKotlin().getHardwareDetails();
       }
    
    
    }

    3.2.2 To provide the necessary configurations to JNIgen for code generation, we will create a .yaml file named jnigen.yaml in the root of the project.

       jnigen.yaml 

    android_sdk_config:
     add_gradle_deps: true
    
    
    output:
     c:
       library_name: hardware_utils
       path: src/
     dart:
       path: lib/hardware_utils.dart
       structure: single_file
    
    
    source_path:
     - 'android/app/src/main/java'
    
    
    classes:
     - 'com.hardware.utils'

    3.2.3 Let’s generate C & Dart code.

    Execute the following command to create APK:

    flutter build apk

    After the successful execution of the above command, execute the following command:

    dart run jnigen --config jnigen.yaml

    3.2.4 Add the address of CMakeLists.txt in your android >> app >> build.gradle file’s buildTypes section as shown below :

    buildTypes {
            externalNativeBuild {
                cmake {
                    path "../../src/CMakeLists.txt"
                }
            }
      }

    3.2.5 The final step is to call the methods from the Dart code that was generated by JNIgen.

    To do this, replace the MyHomePage class code in the main.dart file with the code below.

    class MyHomePage extends StatefulWidget {
     const MyHomePage({super.key, required this.title});
    
     final String title;
    
     @override
     State<MyHomePage> createState() => _MyHomePageState();
    }
    
    class _MyHomePageState extends State<MyHomePage> {
     String _hardwareDetails = '';
     String _hardwareDetailsKotlin = '';
     JObject activity = JObject.fromRef(Jni.getCurrentActivity());
    
     @override
     void initState() {
       JMap<JString, JString> deviceHardwareDetails =
           HardwareUtils().getHardwareDetails();
       _hardwareDetails = 'This device details from Java class:\n';
       deviceHardwareDetails.forEach((key, value) {
         _hardwareDetails =
             '$_hardwareDetails\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       JMap<JString, JString> deviceHardwareDetailsKotlin =
           HardwareUtils().getHardwareDetailsKotlin();
       _hardwareDetailsKotlin = 'This device details from Kotlin class:\n';
       deviceHardwareDetailsKotlin.forEach((key, value) {
         _hardwareDetailsKotlin =
             '$_hardwareDetailsKotlin\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       setState(() {
         _hardwareDetails;
         _hardwareDetailsKotlin;
       });
       super.initState();
     }
    
     @override
     Widget build(BuildContext context) {
       return Scaffold(
         appBar: AppBar(
           title: Text(widget.title),
         ),
         body: Center(
           child: Column(
             mainAxisAlignment: MainAxisAlignment.center,
             children: <Widget>[
               Text(
                 _hardwareDetails,
                 textAlign: TextAlign.center,
               ),
               SizedBox(height: 20,),
               Text(
                 _hardwareDetailsKotlin,
                 textAlign: TextAlign.center,
               ),
             ],
           ),
         ),
       );
     }
    }

    After all of this, when we launch our app, we will see information about our Android device.

    4. Result

    For your convenience, the complete code for the project can be found here. Feel free to refer to this code repository for a comprehensive overview of the implementation details and to access the entirety of the source code.

    5. Conclusion

    In conclusion, we explored the limitations of the traditional approach to native API access in Flutter for mid to large-scale projects. Through our insightful exploration of JNIgen’s working principles, we uncovered its remarkable potential for simplifying the native integration process.

    By gaining a deep understanding of JNIgen’s inner workings, we successfully developed a sample project and provided detailed guidance on the essential setup requirements. Armed with this knowledge, developers can embrace JNIgen’s capabilities to streamline their native integration process effectively.

    We can say that JNIgen is a valuable tool for Flutter developers seeking to combine the power of Flutter’s cross-platform capabilities with the flexibility and performance benefits offered by native code. It empowers developers to build high-quality apps that seamlessly integrate platform-specific features and existing native code libraries, ultimately enhancing the overall user experience. 

    Hopefully, this blog post has inspired you to explore the immense potential of JNIgen in your Flutter applications. By harnessing JNIgen, we can open doors to new possibilities.

    Thank you for taking the time to read through this blog!

    6. Reference

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/jnigen
    3. https://pub.dev/packages/jni
    4. https://github.com/dart-lang/jnigen
    5. https://github.com/dart-lang/jnigen#readme
    6. https://github.com/dart-lang/jnigen/wiki/Architecture-&-Design-Notes
    7. https://medium.com/simform-engineering/jnigen-an-easy-way-to-access-platform-apis-cb1fd3101e33
    8. https://medium.com/@marcoedomingos/the-ultimate-showdown-methodchannel-vs-d83135f2392d

  • Serverpod: The Ultimate Backend for Flutter

    Join us on this exhilarating journey, where we bridge the gap between frontend and backend development with the seamless integration of Serverpod and Flutter.

    Gone are the days of relying on different programming languages for frontend and backend development. With Flutter’s versatile framework, you can effortlessly create stunning user interfaces for a myriad of platforms. However, the missing piece has always been the ability to build the backend in Dart as well—until now.

    Introducing Serverpod, the missing link that completes the Flutter ecosystem. Now, with Serverpod, you can develop your entire application, from frontend to backend, all within the familiar and elegant Dart language. This synergy enables a seamless exchange of data and functions between the client and the server, reducing development complexities and boosting productivity.

    1. What is Serverpod?

    As developers and tech enthusiasts, we recognize the critical role backend services play in the success of any application. Whether you’re building a web, mobile, or desktop project, a robust backend infrastructure is the backbone that ensures seamless functionality and scalability.

    That’s where “Serverpod” comes into the picture—an innovative backend solution developed entirely in Dart, just like your Flutter projects. With Serverpod at your disposal, you can harness the full power of Dart on both the frontend and backend, creating a harmonious development environment that streamlines your workflow.

    The biggest advantage of using Serverpod is that it automates protocol and client-side code generation by analyzing your server, making remote endpoint calls as simple as local method calls.
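    As a rough illustration of that claim (abridged and based on the shape of Serverpod’s default project template; treat the exact names as illustrative), a server endpoint and the corresponding generated client call look like this:

```dart
// Server side: an endpoint method in the _server package.
class ExampleEndpoint extends Endpoint {
  Future<String> hello(Session session, String name) async {
    return 'Hello $name';
  }
}

// Client side: after code generation, the remote call reads like a local one.
final greeting = await client.example.hello('World');
```

    Note that the Session parameter is supplied by the framework on the server; the client never passes it.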

    1.1. Current market status

    The top 10 programming languages for backend development in 2023 are as follows: 

    [Note: The results presented here are not absolute and are based on a combination of surveys conducted in 2023, including ‘Stack Overflow Developer Survey – 2023,’ ‘State of the Developer Ecosystem Survey,’ ‘New Stack Developer Survey,’ and more.]

    • Node.js – ~32%
    • Python (Django, Flask) – ~28%
    • Java (Spring Boot, Java EE) – ~18%
    • Ruby (Ruby on Rails) – ~7%
    • PHP (Laravel, Symfony) – ~6%
    • Go (Golang) – ~3%
    • .NET (C#) – ~2%
    • Rust – ~1%
    • Kotlin (Spring Boot with Kotlin) – ~1%
    • Express.js (for Node.js) – ~1%
    Figure 01

    Figure 01 provides a comprehensive overview of the current usage of backend development technologies, showcasing a plethora of options with diverse features and capabilities. However, the landscape takes a different turn when it comes to frontend development. While the backend technologies offer a wealth of choices, most of these languages lack native multiplatform support for frontend applications.

    As a result, developers find themselves in a situation where they must choose between two sets of languages or technologies for backend and frontend business logic development.

    1.2. New solution

    As the demand for multiplatform applications continues to grow, developers are actively exploring new frameworks and languages that bridge the gap between backend and frontend development. Recently, a groundbreaking solution has emerged in the form of Serverpod. With Serverpod, developers can now accomplish server development in Dart, filling the crucial gap that was previously missing in the Flutter ecosystem.

    Flutter has already demonstrated its remarkable support for a wide range of platforms. The absence of server development capabilities was a notable limitation that has now been triumphantly addressed with the introduction of Serverpod. This remarkable achievement enables developers to harness the power of Dart to build both frontend and backend components, creating unified applications with a shared codebase.

    2. Configurations 

    Prior to proceeding with the code implementation, it is essential to set up and install the necessary tools.

    [Note: Given Serverpod’s initial stage, encountering errors without readily available online solutions is plausible. In such instances, seeking assistance from the Flutter community forum is highly recommended. Drawing from my experience, I suggest running the application on Flutter web first, particularly for Serverpod version 1.1.1, to ensure a smoother development process and gain insights into potential challenges.]

    2.1. Initial setup

    2.1.1 Install Docker

    Docker serves a crucial role in Serverpod, facilitating:

    • Containerization: Applications are packaged and shipped as containers, enabling seamless deployment and execution across diverse infrastructures.
    • Isolation: Applications are isolated from one another, enhancing both security and performance aspects, safeguarding against potential vulnerabilities, and optimizing system efficiency.

    Download & Install Docker from here.

    2.1.2 Install Serverpod CLI 

    • Run the following command:
    dart pub global activate serverpod_cli

    • Now test the installation by running:
    serverpod

    With proper configuration, the Serverpod command displays help information.

    2.2. Project creation

    Before running Serverpod commands, the Docker application must be launched. An active Docker instance running in the background is imperative for Serverpod commands to execute successfully.

    • Create a new project with the command:
    serverpod create <your_project_name>

    Upon execution, a new directory will be generated with the specified project name, comprising three Dart packages:

    <your_project_name>_server: This package is designated for server-side code, encompassing essential components such as business logic, API endpoints, DB connections, and more.
    <your_project_name>_client: Within this package, the code responsible for server communication is auto-generated. Manual editing of files in this package is typically avoided.
    <your_project_name>_flutter: Representing the Flutter app, it comes pre-configured to seamlessly connect with your local server, ensuring seamless communication between frontend and backend elements.
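    Based on the package descriptions above, the generated directory tree looks roughly like this (a sketch; exact contents vary by Serverpod version):

```
<your_project_name>/
├── <your_project_name>_server/    # endpoints, business logic, config/, docker-compose.yaml
├── <your_project_name>_client/    # auto-generated communication code (do not edit)
└── <your_project_name>_flutter/   # the Flutter app, pre-wired to the local server
```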

    2.3. Project execution

    Step 1: Navigate to the server package with the following command:

    cd <your_project_name>/<your_project_name>_server

    Step 2: (Optional) Open the project in the VS Code IDE using the command:

    (Note: You can use any IDE you prefer, but for our purposes, we’ll use VS Code, which also simplifies DB connection later.)

    code .

    Step 3: Once the project is open in the IDE, run the table setup script:

    .\setup-tables.cmd

    Step 4: Before starting the server, initiate new Docker containers with the following command:

    docker-compose up --build --detach

    Step 5: The command above will start PostgreSQL and Redis containers, and you should receive the output:

    ~> docker-compose up --build --detach
    	[+] Running 2/2
     	✔ Container <your_project_name>_server-redis-1     Started                                                                                                
     	✔ Container <your_project_name>_server-postgres-1  Started

    (Note: If the output doesn’t match, refer to this Stack Overflow link for missing commands in the official documentation.)

    Step 6: Proceed to start the server with this command:

    dart bin/main.dart

    Step 7: Upon successful execution, you will receive the following output, where the “Server Default listening on port” value is crucial. Please take note of this value.

    ~> dart bin/main.dart
     	SERVERPOD version: 1.1.1, dart: 3.0.5 (stable) (Mon Jun 12 18:31:49 2023 +0000) on "windows_x64", time: 2023-07-19 15:24:27.704037Z
     	mode: development, role: monolith, logging: normal, serverId: default
     	Insights listening on port 8081
     	Server default listening on port 8080
     	Webserver listening on port 8082
     	CPU and memory usage metrics are not supported on this platform.

    Step 8: Append the “Server default listening on port” value to “localhost” (i.e., “127.0.0.1”) and load this URL in your browser. Accessing “localhost:8080” will display the desired output, indicating that your server is running and ready to process requests.

    Figure 02

    Step 9: Now, as the containers reach the “Started” state, you can establish a connection with the database. We have opted for PostgreSQL as our DB choice, and the rationale behind this selection lies in the “docker-compose.yaml” file at the server project’s root. In the “service” section, PostgreSQL is already added, making it an ideal choice as the required setup is readily available. 
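    For reference, the relevant part of the generated docker-compose.yaml looks roughly like this (an abridged sketch; the image versions, port mappings, and environment variables are placeholders, so check your own generated file):

```yaml
services:
  postgres:
    image: postgres
    ports:
      - '8090:5432'   # host port is project-specific
    environment:
      POSTGRES_USER: postgres
  redis:
    image: redis
    ports:
      - '8091:6379'   # host port is project-specific
```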

    Figure 03

    For the database setup, we need key information, such as Host, Port, Username, and Password. You can find all this vital information in the “config” directory’s “development.yaml” and “passwords.yaml” files. If you encounter difficulties locating these details, please refer to Figure 04.

    Figure 04

    Step 10: To establish the connection, you can install an application similar to Postico or, alternatively, I recommend using the MySQL extension, which can be installed in your VS Code with just one click.

    Figure 05

    Step 11: Follow these next steps:

    1. Select the “Database” option.
    2. Click on “Create Connection.”
    3. Choose the “PostgreSQL” option.
    4. Add a name for your Connection.
    5. Fill in the information collected in the last step.
    6. Finally, select the “Connect” option.
    Figure 06
    7. Upon success, you will receive a “Connect Success!” message, and the new connection will be added to the Explorer Tab.
    Figure 07

    Step 12: Now, we shift our focus to the Flutter project (Frontend):

    Thus far, we have been working on the server project. Let us open a new VS Code instance for a separate Flutter project while keeping the current VS Code instance active, serving as the server.

    Step 13: Execute the following command to run the Flutter project on Chrome:

    flutter run -d chrome

    With this, the default project will generate the following output:

    Step 14: When you are finished, you can shut down Serverpod with “Ctrl-C.”

    Step 15: Then stop Postgres and Redis.

    docker compose stop

    Figure 08

    3. Sample Project

    So far, we have successfully created and executed the project, identifying three distinct components. The server project caters to server/backend development, while the Flutter project handles application/frontend development. The client project, automatically generated, serves as the vital intermediary, bridging the gap between the frontend and backend.

    However, merely acknowledging the projects’ existence is insufficient. To maximize our proficiency, it is crucial to grasp the code and file structure comprehensively. To achieve this, we will embark on a practical journey, constructing a small project to gain hands-on experience and unlock deeper insights into all three components. This approach empowers us with a well-rounded understanding, further enhancing our capabilities in building remarkable applications.

    3.1. What are we building?

    In this blog, we will construct a sample project with basic Login and SignUp functionality. The SignUp process will collect user information such as Email, Password, Username, and age. Users can subsequently log in using their email and password, leading to the display of user details on the dashboard screen. With the initial system setup complete and the newly created project up and running, it’s time to commence coding. 

    3.1.1 Create custom models for API endpoints

    Step 1: Create a new file in the “lib >> src >> protocol” directory named “users.yaml”:

    class: Users
    table: users
    fields:
      username: String
      email: String
      password: String
      age: int

    Step 2: Save the file and run the following command to generate essential data classes and table creation queries:

    serverpod generate

    (Note: Add “--watch” after the command for continuous code generation.) 

    Successful execution of the above command will generate a new file named “users.dart” in the “lib >> src >> generated” folder. Additionally, the “tables.pgsql” file now contains SQL queries for creating the “users” table. The same command updates the auto-generated code in the client project. 
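    For the users.yaml model above, the queries in tables.pgsql should look roughly like the following (an approximation; exact column types, quoting, and index statements depend on the Serverpod version):

```sql
CREATE TABLE "users" (
  "id" serial PRIMARY KEY,
  "username" text NOT NULL,
  "email" text NOT NULL,
  "password" text NOT NULL,
  "age" integer NOT NULL
);
```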

    3.1.2 Create Tables in DB for the generated model 

    Step 1: Copy the queries written in the “generated >> tables.pgsql” file.

    In the MySQL Extension’s Database section, select the created database >> [project_name] >> public >> Tables >> + (Create New Table).

    Figure 09

    Step 2: Paste the queries into the newly created .sql file and click “Execute” above both queries.

    Figure 10

    Step 3: After execution, you will obtain an empty table with the “id” as the Primary key.

    Figure 11

If you find multiple tables already present in your database, as shown in the next figure, you can ignore them. These tables are created by the system with queries present in the “generated >> tables-serverpod.pgsql” file.

    Figure 12

    3.1.3 Create an API endpoint

Step 1: Create a new file in the “lib >> src >> endpoints” directory named “session_endpoints.dart”:

class SessionEndpoint extends Endpoint {
  Future<Users?> login(Session session, String email, String password) async {
    List<Users> userList = await Users.find(session,
        where: (p0) =>
            (p0.email.equals(email)) & (p0.password.equals(password)));
    return userList.isEmpty ? null : userList[0];
  }

  Future<bool> signUp(Session session, Users newUser) async {
    try {
      await Users.insert(session, newUser);
      return true;
    } catch (e) {
      print(e.toString());
      return false;
    }
  }
}

If “serverpod generate --watch” is already running, you can skip Step 2.

    Step 2: Run the command:

    serverpod generate

Step 3: Start the server.
[For help, check out Steps 1 to 6 mentioned in the Project Execution part.]

3.1.4 Create three screens

    Login Screen:

    Figure 13

    SignUp Screen:

    Figure 14

    Dashboard Screen:

    Figure 15

3.1.5 Set up Flutter code

Step 1: Add the following code to the SignUp button in the SignUp screen to handle user signups.

try {
  final result = await client.session.signUp(
    Users(
      email: _emailEditingController.text.trim(),
      username: _usernameEditingController.text.trim(),
      password: _passwordEditingController.text.trim(),
      age: int.parse(_ageEditingController.text.trim()),
    ),
  );
  if (result) {
    Navigator.pop(context);
  } else {
    _errorText = 'Something went wrong, Try again.';
  }
} catch (e) {
  debugPrint(e.toString());
  _errorText = e.toString();
}

Step 2: Add the following code to the Login button in the Login screen to handle user logins.

try {
  final result = await client.session.login(
    _emailEditingController.text.trim(),
    _passwordEditingController.text.trim(),
  );
  if (result != null) {
    _emailEditingController.text = '';
    _passwordEditingController.text = '';
    Navigator.push(
      context,
      MaterialPageRoute(
        builder: (context) => DashboardPage(user: result),
      ),
    );
  } else {
    _errorText = 'Something went wrong, Try again.';
  }
} catch (e) {
  debugPrint(e.toString());
  _errorText = e.toString();
}

    Step 3: Implement logic to display user data on the dashboard screen.

    With these steps completed, our Flutter app becomes a fully functional project, showcasing the power of this new technology. Armed with Dart knowledge, every Flutter developer can transform into a proficient full-stack developer.

    4. Result

    Figure 16

The entire project code is available in this code repository. Feel free to refer to it for an in-depth understanding of the implementation details and access to the complete source code.

    5. Conclusion

    In conclusion, we have provided a comprehensive walkthrough of the step-by-step setup process for running Serverpod seamlessly. We explored creating data models, integrating the database with our server project, defining tables, executing data operations, and establishing accessible API endpoints for Flutter applications.

    Hopefully, this blog post has kindled your curiosity to delve deeper into Serverpod’s immense potential for elevating your Flutter applications. Embracing Serverpod unlocks a world of boundless possibilities, empowering you to achieve remarkable feats in your development endeavors.

    Thank you for investing your time in reading this informative blog!

    6. References

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/serverpod/
    3. https://serverpod.dev/
    4. https://docs.docker.com/get-docker/
    5. https://medium.com/serverpod/introducing-serverpod-a-complete-backend-for-flutter-written-in-dart-f348de228e19
    6. https://medium.com/serverpod/serverpod-our-vision-for-a-seamless-scalable-backend-for-the-flutter-community-24ba311b306b
    7. https://stackoverflow.com/questions/76180598/serverpod-sql-error-when-starting-a-clean-project
    8. https://www.youtube.com/watch?v=3Q2vKGacfh0
    9. https://www.youtube.com/watch?v=8sCxWBWhm2Y

  • Setting up Mutual TLS Authentication and Authorization on Amazon MSK

    Overview

    We will cover how to set up mutual TLS authentication and authorization on Amazon MSK.

    Amazon MSK is a fully managed service that makes it easy to build and run applications that use Apache Kafka to process streaming data. You can enable client authentication with TLS for connections and client authorization from your applications to your Amazon MSK brokers and ZooKeeper nodes. 

    Prerequisites

    • Terraform: For creating a private CA and MSK Cluster
    • AWS CLI: For creating TLS certificates (the user must have access to create a private CA, issue certificates, and create MSK cluster)

    Setup TLS authentication and authorization

    To use client authentication with TLS on MSK, you need to create the following resources:

    • AWS Private CA
    • MSK cluster with TLS encryption enabled
    • Client certificates

    Create AWS Private CA

AWS Private CA can be either in the same AWS account as your cluster, or in a different account. For information about AWS Private CAs, see Creating and Managing an AWS Private CA. In this setup, we will use Terraform to create a private CA.

    Steps to create Private CA

1. Run the Terraform code below to create the private CA.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.0"
    }
  }
}

resource "aws_acmpca_certificate_authority" "root_ca" {
  certificate_authority_configuration {
    key_algorithm     = "RSA_4096"
    signing_algorithm = "SHA512WITHRSA"
    subject {
      # Update the attributes as per your need
      common_name         = "exp-msk-ca"
      country             = "US"
      locality            = "Seattle"
      organization        = "Example Corp"
      organizational_unit = "Sales"
      state               = "WA"
    }
  }
  type = "ROOT"
}

2. Once the private CA is created, install the certificate from the AWS console.

    Steps to install the certificate.

    • If you are not already on the CA’s details page, open the AWS Private CA console at https://console.aws.amazon.com/acm-pca/home. On the private certificate authorities page, choose a root CA that you have created with the certificate status as Pending or Active.
• Choose Actions, then Install CA certificate to open the Install root CA certificate page.
    • Under Specify the root CA certificate parameters, specify the following certificate parameters:
    • Validity — Specifies the expiration date and time for the CA certificate. The AWS Private CA default validity period for a root CA certificate is ten years.
• Signature algorithm — Specifies the signing algorithm to use when the root CA issues new certificates. Available options vary according to the AWS Region where you are creating the CA. For more information, see Compatible signing algorithms, Supported cryptographic algorithms, and SigningAlgorithm in CertificateAuthorityConfiguration. In this setup, select SHA256 RSA.
    • Review your settings to make sure they’re correct, then choose Confirm and install.        
    • The details page for the CA displays the status of the installation (success or failure) at the top. If the installation was successful, the newly completed root CA displays a status of Active in the General pane.

Create an MSK cluster that supports TLS client authentication

Note: We highly recommend using an independent AWS Private CA for each MSK cluster when you use mutual TLS to control access. Doing so ensures that TLS certificates signed by a private CA authenticate against only a single MSK cluster.

Run the Terraform code below to create the MSK cluster.

    Note: Update attributes as per the requirement and configurations.

    terraform {
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = "~> 4.0"
        }
      }
    }
    module "kafka" {
      source = "cloudposse/msk-apache-kafka-cluster/aws"
      # Cloud Posse recommends pinning every module to a specific version
      version                       = "2.3.0"
      name                          = "test-msk-cluster" #Change MSK cluster name as per your need
      vpc_id                        = "<VPC_ID>" 
  subnet_ids                    = ["SUBNET1a","SUBNET2b"] # Minimum 2 subnets required.
      kafka_version                 = "3.4.0" #recommended version by AWS as of 19 Sep 2022
      broker_per_zone               = 1 #Number of broker per availability zone
      broker_instance_type          = "kafka.t3.small" #MSK instance types
      broker_volume_size            = 10 #Broker disk size
      certificate_authority_arns    = ["<CA_ARN>"]  #arn of the CA that you have created in the earlier step
      client_tls_auth_enabled       = true
      encryption_in_cluster         = true 
      client_broker                 = "TLS" # Enables TLS encryption
      enhanced_monitoring           = "PER_TOPIC_PER_BROKER"
      cloudwatch_logs_enabled       = false # Enable if you need cloudwatch logs
      jmx_exporter_enabled          = false # Enable if you need jmx metrics
      node_exporter_enabled         = false # Enable if you need node metrics
      associated_security_group_ids = ["${aws_security_group.kafka_sg.id}"]
      allowed_security_group_ids    = ["${aws_security_group.kafka_sg.id}"]
      create_security_group         = false
    }
    #-----------------------End--------------------#
    resource "aws_security_group" "kafka_sg" {
      name        = "test-msk-cluster-sg" #Change the name as per your need 
      description = "Security Group for kafka cluster"
      vpc_id      = "<VPC_ID>"
      egress {
        from_port        = 0
        to_port          = 0
        protocol         = "-1"
        cidr_blocks      = ["0.0.0.0/0"]
        ipv6_cidr_blocks = ["::/0"]
      }
      ingress {
        from_port        = 2181
        to_port          = 2181
        protocol         = "tcp"
        cidr_blocks      = ["0.0.0.0/0"]
        ipv6_cidr_blocks = ["::/0"]
      }
      ingress {
        from_port        = 9094
        to_port          = 9094
        protocol         = "tcp"
        cidr_blocks      = ["0.0.0.0/0"]
        ipv6_cidr_blocks = ["::/0"]
      }
      # Enable if you need to add tags to MSK cluster
      #tags = var.tags
      # Enable if you need cloudwatch logs
      # depends_on = [
      #   aws_cloudwatch_log_group.cw_log_group
      #]
    }
    # Required for cloudwatch logs
    # resource "aws_cloudwatch_log_group" "cw_log_group" {
    #   name = "blog-msk-cluster"
    #   #tags = var.tags
    # }
    
    output "bootstrap_url" {
      value       = module.kafka.bootstrap_brokers_tls
      description = "Comma separated list of one or more DNS names (or IP addresses) and TLS port pairs for access to the Kafka cluster using TLS"
    }

    It will take 15-20 minutes to create the MSK cluster. 

    Note: Since the bootstrap URL will be used to communicate with the MSK cluster using the Kafka CLI or SDKs, save it from the Terraform output.

    Create TLS certificates using previously created AWS Private CA

We will create two certificates: one for admin access and the other for client access. Creating a certificate requires a common name (CN); the CN is used as the principal when granting permissions through Kafka ACLs.

    Create admin TLS certificate

    Steps to create TLS certificate

    1. Generate CSR and key.
    openssl req -newkey rsa:2048 -keyout key.pem -out cert.csr -batch -nodes -subj '/CN=admin'

2. Issue a certificate using the previously created private CA (replace <CA_ARN> with the ARN of the AWS Private CA that you created).
certArn=$(aws acm-pca issue-certificate --region <region> \
  --certificate-authority-arn "<CA_ARN>" \
  --csr fileb://cert.csr \
  --signing-algorithm 'SHA256WITHRSA' \
  --validity Value=180,Type='DAYS' \
  --query 'CertificateArn' --output text)

3. Get the certificate using the ARN obtained in the previous step (the sed command splits the tab-separated certificate and chain onto separate lines).
aws acm-pca get-certificate --region <region> \
  --certificate-authority-arn "<CA_ARN>" \
  --certificate-arn "${certArn}" \
  --output text | sed 's/\t/\n/g' > cert.pem

4. Export the certificate in PKCS#12 format.
openssl pkcs12 -export -in cert.pem -inkey key.pem -name ssl-configurator \
  -password pass: -out admin.p12
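Before copying admin.p12 to the client machine, it can be worth confirming that the keystore opens with the empty password and carries the expected identity. The sketch below substitutes a throwaway self-signed certificate with the same CN for the PCA-issued one, since it only exercises the local openssl steps; the final inspection command is the one you would run against the real admin.p12:

```shell
# Throwaway self-signed stand-in for the PCA-issued certificate (same CN).
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
  -days 1 -nodes -subj '/CN=admin'

# Export to PKCS#12 exactly as in the export step above (empty password).
openssl pkcs12 -export -in cert.pem -inkey key.pem -name ssl-configurator \
  -password pass: -out admin.p12

# The keystore should open with the empty password; the printed subject
# should contain CN=admin.
openssl pkcs12 -in admin.p12 -password pass: -nokeys | \
  openssl x509 -noout -subject
```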

    Create client TLS certificate

1. Generate a CSR and key.
openssl req -newkey rsa:2048 -keyout key.pem -out cert.csr -batch -nodes -subj '/CN=client'

2. Issue a certificate using the previously created private CA (replace <CA_ARN> with the ARN of the AWS Private CA that you created).
certArn=$(aws acm-pca issue-certificate --region <region> \
  --certificate-authority-arn "<CA_ARN>" \
  --csr fileb://cert.csr \
  --signing-algorithm 'SHA256WITHRSA' \
  --validity Value=180,Type='DAYS' \
  --query 'CertificateArn' --output text)

3. Get the certificate using the ARN obtained in the previous step.
aws acm-pca get-certificate --region <region> \
  --certificate-authority-arn "<CA_ARN>" \
  --certificate-arn "${certArn}" \
  --output text | sed 's/\t/\n/g' > cert.pem

4. Export the certificate in PKCS#12 format.
openssl pkcs12 -export -in cert.pem -inkey key.pem -name ssl-configurator \
  -password pass: -out client.p12

Set up a client machine to interact with the MSK cluster

    1. Create an Amazon EC2 instance to use as a client machine. For simplicity, create this instance in the same VPC you used for the cluster. See Step 3: Create a client machine for an example of how to create such a client machine.
    2. Copy previously created certificates admin.p12 and client.p12 into the client machine.
3. Install Java 8+ on the client machine.
4. Download the Kafka binaries from https://archive.apache.org/dist/kafka/3.4.0/kafka_2.13-3.4.0.tgz and extract them.
5. Create admin and client configuration files for authentication and authorization.
cat <<EOF> admin.properties
bootstrap.servers=<BOOTSTRAP_URL>
security.protocol=SSL
ssl.keystore.location=./admin.p12
ssl.keystore.type=PKCS12
ssl.keystore.password=
EOF
cat <<EOF> client.properties
bootstrap.servers=<BOOTSTRAP_URL>
security.protocol=SSL
ssl.keystore.location=./client.p12
ssl.keystore.type=PKCS12
ssl.keystore.password=
EOF
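A quick sanity check that the generated properties files contain the keys the Kafka CLI tools expect. Note that Java properties files treat quotes as part of the value, so the bootstrap URL should be unquoted; the contents below mirror the step above, with <BOOTSTRAP_URL> still a placeholder:

```shell
# Recreate admin.properties (unquoted values; quotes would become part of
# the value in a Java properties file).
cat <<EOF> admin.properties
bootstrap.servers=<BOOTSTRAP_URL>
security.protocol=SSL
ssl.keystore.location=./admin.p12
ssl.keystore.type=PKCS12
ssl.keystore.password=
EOF

# Fail loudly if any key the TLS client needs is missing.
for key in bootstrap.servers security.protocol ssl.keystore.location ssl.keystore.type; do
  grep -q "^${key}=" admin.properties || { echo "missing ${key}"; exit 1; }
done
echo "admin.properties looks complete"
```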

    Test Authentication and Authorization using ACLs

    Create Admin ACLs for granting admin access to clusters, topics, and groups

By default, the MSK cluster allows every client when no ACL is found for a resource. Here, the Admin ACL will be the first ACL. The admin user (“User:CN=admin”) will use the Admin ACLs to grant permissions to the client user (“User:CN=client”).

    ACL for managing cluster operations (Admin ACL).
./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --add \
  --allow-principal "User:CN=admin" \
  --operation All \
  --cluster \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

    ACL for managing topics permissions (Admin ACL).
./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --add \
  --allow-principal "User:CN=admin" \
  --operation All \
  --topic "*" \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

    ACL for managing group permissions (Admin ACL).
./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --add \
  --allow-principal "User:CN=admin" \
  --operation All \
  --group "*" \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

    Create a topic.

./kafka_2.13-3.4.0/bin/kafka-topics.sh --bootstrap-server "<BOOTSTRAP_URL>" \
  --create --topic test-topic --command-config admin.properties

List the topics and check that the topic was created.

./kafka_2.13-3.4.0/bin/kafka-topics.sh --bootstrap-server "<BOOTSTRAP_URL>" \
  --list --command-config admin.properties

Grant Write permission on the topic so that the client (producer) can publish messages to it (use the admin user to grant access to the client).

./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --add \
  --allow-principal "User:CN=client" \
  --operation Write \
  --topic "test-topic" \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

Publish messages to the topic using the client user.

for x in {1..10}; do echo "message $x"; done | \
  ./kafka_2.13-3.4.0/bin/kafka-console-producer.sh \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --producer.config client.properties \
  --topic test-topic \
  --producer-property enable.idempotence=false

    Consume messages.

    Note: If you try to consume messages from the topic using a consumer group, you will get a group authorization error since the client user is not authorized to access groups.

./kafka_2.13-3.4.0/bin/kafka-console-consumer.sh \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --topic test-topic --max-messages 2 \
  --consumer-property enable.auto.commit=false \
  --consumer-property group.id=consumer-test \
  --from-beginning --consumer.config client.properties

    Grant group permission to the client user.

./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --add \
  --allow-principal "User:CN=client" \
  --operation Read \
  --resource-pattern-type prefixed \
  --group 'consumer-' \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

After granting group access, the client user should be able to consume messages from the topic using a consumer group. (Consuming also requires Read permission on the topic itself, which can be granted the same way the Write permission was granted above.)

    This way, you can manage client access to the topics and groups.

    Additional Commands 

List ACLs.
./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --list \
  --command-config admin.properties

Delete ACL (the create and delete ACL commands are the same except for the --add/--remove argument).
./kafka_2.13-3.4.0/bin/kafka-acls.sh \
  --remove \
  --allow-principal "User:CN=admin" \
  --operation All \
  --cluster \
  --bootstrap-server "<BOOTSTRAP_URL>" \
  --command-config admin.properties

    Conclusion

Amazon MSK eases the effort of managing self-hosted Kafka clusters. Users can scale Kafka brokers and storage as necessary. MSK supports TLS encryption and allows users to create TLS connections from the application to Amazon MSK brokers and ZooKeeper nodes with the help of AWS Private CA, which enables users to create certificates for authentication.

  • Integrating Augmented Reality in a Flutter App to Enhance User Experience

    In recent years, augmented reality (AR) has emerged as a cutting-edge technology that has revolutionized various industries, including gaming, retail, education, and healthcare. Its ability to blend digital information with the real world has opened up a new realm of possibilities. One exciting application of AR is integrating it into mobile apps to enhance the user experience.

    In this blog post, we will explore how to leverage Flutter, a powerful cross-platform framework, to integrate augmented reality features into mobile apps and elevate the user experience to new heights.

Understanding Augmented Reality:

    Before we dive into the integration process, let’s briefly understand what augmented reality is. Augmented reality is a technology that overlays computer-generated content onto the real world, enhancing the user’s perception and interaction with their environment. Unlike virtual reality (VR), which creates a fully simulated environment, AR enhances the real world by adding digital elements such as images, videos, and 3D models.

    The applications of augmented reality are vast and span across different industries. In gaming, AR has transformed mobile experiences by overlaying virtual characters and objects onto the real world. It has also found applications in areas such as marketing and advertising, where brands can create interactive campaigns by projecting virtual content onto physical objects or locations. AR has also revolutionized education by offering immersive learning experiences, allowing students to visualize complex concepts and interact with virtual models.

    In the upcoming sections, we will explore the steps to integrate augmented reality features into mobile apps using Flutter.

What is Flutter?

    Flutter is an open-source UI (user interface) toolkit developed by Google for building natively compiled applications for mobile, web, and desktop platforms from a single codebase. It allows developers to create visually appealing and high-performance applications with a reactive and customizable user interface.

    The core language used in Flutter is Dart, which is also developed by Google. Dart is a statically typed, object-oriented programming language that comes with modern features and syntax. It is designed to be easy to learn and offers features like just-in-time (JIT) compilation during development and ahead-of-time (AOT) compilation for optimized performance in production.

    Flutter provides a rich set of customizable UI widgets that enable developers to build beautiful and responsive user interfaces. These widgets can be composed and combined to create complex layouts and interactions, giving developers full control over the app’s appearance and behavior.

    Why Choose Flutter for AR Integration?

    Flutter, backed by Google, is a versatile framework that enables developers to build beautiful and performant cross-platform applications. Its rich set of UI components and fast development cycle make it an excellent choice for integrating augmented reality features. By using Flutter, developers can write a single codebase that runs seamlessly on both Android and iOS platforms, saving time and effort.

    Flutter’s cross-platform capabilities enable developers to write code once and deploy it on multiple platforms, including iOS, Android, web, and even desktop (Windows, macOS, and Linux).

The Flutter ecosystem is supported by a vibrant community, offering a wide range of packages and plugins that extend its capabilities. These packages cover various functionalities such as networking, database integration, state management, and more, making it easy to add complex features to your Flutter applications.

Steps to Integrate AR in a Flutter App:

    Step 1: Set Up Flutter Project:

Assuming you already have Flutter installed on your system, create a new Flutter project or open an existing one to start integrating AR features. If not, follow the setup guide at https://docs.flutter.dev/get-started/install.

    Step 2: Add ar_flutter_plugin dependency:

    Update the pubspec.yaml file of your Flutter project and add the following line under the dependencies section:

dependencies:
  ar_flutter_plugin: ^0.7.3

    This step ensures that your Flutter project has the necessary dependencies to integrate augmented reality using the ar_flutter_plugin package.
    Run `flutter pub get` to fetch the package.

    Step 3: Initializing the AR View:

    Create a new Dart file for the AR screen. Import the required packages at the top of the file:

Define a new class called ArScreen that extends StatefulWidget, along with its State class. This class represents the AR screen and handles the initialization and rendering of the AR view:

import 'package:ar_flutter_plugin/ar_flutter_plugin.dart';
import 'package:ar_flutter_plugin/datatypes/config_planedetection.dart';
import 'package:ar_flutter_plugin/datatypes/hittest_result_types.dart';
import 'package:ar_flutter_plugin/datatypes/node_types.dart';
import 'package:ar_flutter_plugin/managers/ar_anchor_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_location_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_object_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_session_manager.dart';
import 'package:ar_flutter_plugin/models/ar_anchor.dart';
import 'package:ar_flutter_plugin/models/ar_hittest_result.dart';
import 'package:ar_flutter_plugin/models/ar_node.dart';
import 'package:flutter/material.dart';
import 'package:vector_math/vector_math_64.dart';

class ArScreen extends StatefulWidget {
  const ArScreen({Key? key}) : super(key: key);

  @override
  _ArScreenState createState() => _ArScreenState();
}

class _ArScreenState extends State<ArScreen> {
  ARSessionManager? arSessionManager;
  ARObjectManager? arObjectManager;
  ARAnchorManager? arAnchorManager;

  List<ARNode> nodes = [];
  List<ARAnchor> anchors = [];

  @override
  void dispose() {
    super.dispose();
    arSessionManager!.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
        appBar: AppBar(
          title: const Text('Anchors & Objects on Planes'),
        ),
        body: Stack(children: [
          ARView(
            onARViewCreated: onARViewCreated,
            planeDetectionConfig: PlaneDetectionConfig.horizontalAndVertical,
          ),
          Align(
            alignment: FractionalOffset.bottomCenter,
            child: Row(
                mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                children: [
                  ElevatedButton(
                      onPressed: onRemoveEverything,
                      child: const Text("Remove Everything")),
                ]),
          )
        ]));
  }

    Step 4: Add AR functionality:

Create a method onARViewCreated for the ARView’s onARViewCreated callback. You can add the required AR functionality in this method, such as loading 3D models or handling interactions. In our demo, we will add 3D models to the scene on tap:

    void onARViewCreated(
          ARSessionManager arSessionManager,
          ARObjectManager arObjectManager,
          ARAnchorManager arAnchorManager,
          ARLocationManager arLocationManager) {
        this.arSessionManager = arSessionManager;
        this.arObjectManager = arObjectManager;
        this.arAnchorManager = arAnchorManager;
    
        this.arSessionManager!.onInitialize(
              showFeaturePoints: false,
              showPlanes: true,
              customPlaneTexturePath: "Images/triangle.png",
              showWorldOrigin: true,
            );
        this.arObjectManager!.onInitialize();
    
        this.arSessionManager!.onPlaneOrPointTap = onPlaneOrPointTapped;
        this.arObjectManager!.onNodeTap = onNodeTapped;
      }

    After this, create a method onPlaneOrPointTapped for handling interactions.

    Future<void> onPlaneOrPointTapped(
          List<ARHitTestResult> hitTestResults) async {
        var singleHitTestResult = hitTestResults.firstWhere(
            (hitTestResult) => hitTestResult.type == ARHitTestResultType.plane);
        var newAnchor =
            ARPlaneAnchor(transformation: singleHitTestResult.worldTransform);
        bool? didAddAnchor = await arAnchorManager!.addAnchor(newAnchor);
        if (didAddAnchor!) {
          anchors.add(newAnchor);
      // Add a node to the anchor
          var newNode = ARNode(
              type: NodeType.webGLB,
              uri:
    "https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/glTF-Binary/Duck.glb",
              scale: Vector3(0.2, 0.2, 0.2),
              position: Vector3(0.0, 0.0, 0.0),
              rotation: Vector4(1.0, 0.0, 0.0, 0.0));
          bool? didAddNodeToAnchor = await arObjectManager!
              .addNode(newNode, planeAnchor: newAnchor);
          if (didAddNodeToAnchor!) {
            nodes.add(newNode);
          } else {
            arSessionManager!.onError("Adding Node to Anchor failed");
          }
        } else {
          arSessionManager!.onError("Adding Anchor failed");
        }
      }

Finally, create an onRemoveEverything method to remove all the elements on the screen.

Future<void> onRemoveEverything() async {
    for (var anchor in anchors) {
      arAnchorManager!.removeAnchor(anchor);
    }
    anchors = [];
  }

    Step 5: Run the AR screen:

In your app’s main entry point, set ArScreen as the home screen:

    void main() {
      runApp(MyApp());
    }
    
    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
      home: ArScreen(),
        );
      }
    }

    In the example below, we can observe the AR functionality implemented. We are loading a Duck 3D Model whenever the user taps on the screen. The plane is auto-detected, and once that is done, we can add a model to it. We also have a floating button to remove everything that is on the plane at the given moment.

Benefits of AR Integration:

    • Immersive User Experience: Augmented reality adds an extra dimension to user interactions, creating immersive and captivating experiences. Users can explore virtual objects within their real environment, leading to increased engagement and satisfaction.
    • Interactive Product Visualization: AR allows users to visualize products in real-world settings before making a purchase. They can view how furniture fits in their living space, try on virtual clothes, or preview architectural designs. This interactive visualization enhances decision-making and improves customer satisfaction.
    • Gamification and Entertainment: Augmented reality opens up opportunities for gamification and entertainment within apps. You can develop AR games, quizzes, or interactive storytelling experiences, providing users with unique and enjoyable content.
    • Marketing and Branding: By incorporating AR into your Flutter app, you can create innovative marketing campaigns and branding experiences. AR-powered product demonstrations, virtual try-ons, or virtual showrooms help generate excitement around your brand and products.

    Conclusion:

Integrating augmented reality into a Flutter app brings a new level of interactivity and immersion to the user experience. Flutter’s compatibility with AR frameworks like ARCore and ARKit empowers developers to create captivating and innovative mobile applications. By following the steps outlined in this blog post, you can unlock the potential of augmented reality and deliver exceptional user experiences that delight and engage your audience. Embrace the possibilities of AR in Flutter and embark on a journey of exciting and immersive app development.

  • Unveiling the Magic of Kubernetes: Exploring Pod Priority, Priority Classes, and Pod Preemption

Introduction:

Generally, during the deployment of a manifest, we observe that some pods get scheduled successfully, while a few critical pods encounter scheduling issues. Therefore, we must schedule the critical pods ahead of the others. While exploring, we discovered that Kubernetes has a built-in solution for this: Pod Priority and Priority Classes. So, in this blog, we’ll talk about Priority Classes and Pod Priority and how we can implement them in our use case.

    Pod Priority:

    It is used to prioritize one pod over another based on its importance. Pod Priority is particularly useful when critical pods cannot be scheduled due to limited resources.

    Priority Classes:

This Kubernetes object defines the priority of pods. The priority is set as an integer value; a higher value gives the pod higher scheduling priority.

    Understanding Priority Values:

    Priority Classes in Kubernetes are associated with priority values that range from 0 to 1000000000, with a higher value indicating greater importance.

    These values act as a guide for the scheduler when allocating resources. 

    Pod Preemption:

    It is already enabled when we create a priority class. The purpose of Pod Preemption is to evict lower-priority pods in order to make room for higher-priority pods to be scheduled.
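    If eviction is ever undesirable, preemption can also be disabled per class. The snippet below is a hedged sketch using the preemptionPolicy field of the scheduling.k8s.io/v1 API; the class name is our own invention:

    ```yaml
    apiVersion: scheduling.k8s.io/v1
    kind: PriorityClass
    metadata:
      name: high-priority-nonpreempting   # hypothetical name
    value: 1000000
    # Pods in this class are scheduled ahead of lower-priority pods,
    # but never evict already-running pods to make room.
    preemptionPolicy: Never
    ```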

    Example Scenario: The Enchanted Shop

    Let’s dive into a scenario featuring “The Enchanted Shop,” a Kubernetes cluster hosting an online store. The shop has three pods, each with a distinct role and priority:

    Priority Class:

    • Create High priority class: 
    apiVersion: scheduling.k8s.io/v1
    kind: PriorityClass
    metadata:
      name: high-priority
    value: 1000000

    • Create Medium priority class:
    apiVersion: scheduling.k8s.io/v1
    kind: PriorityClass
    metadata:
      name: medium-priority
    value: 500000

    • Create Low priority class:
    apiVersion: scheduling.k8s.io/v1
    kind: PriorityClass
    metadata:
      name: low-priority
    value: 100000

    Pods:

    • Checkout Pod (High Priority): This pod is responsible for processing customer orders and must receive top priority.

    Create the Checkout Pod with a high-priority class:

    apiVersion: v1
    kind: Pod
    metadata:
      name: checkout-pod
      labels:
        app: checkout
    spec:
      priorityClassName: high-priority
      containers:
      - name: checkout-container
        image: nginx:checkout

    • Product Recommendations Pod (Medium Priority):

    This pod provides personalized product recommendations to customers and holds moderate importance.

    Create the Product Recommendations Pod with a medium priority class:

    apiVersion: v1
    kind: Pod
    metadata:
      name: product-rec-pod
      labels:
        app: product-recommendations
    spec:
      priorityClassName: medium-priority
      containers:
      - name: product-rec-container
        image: nginx:store

    • Shopping Cart Pod (Low Priority):

    This pod manages customers’ shopping carts and has a lower priority compared to the others.

    Create the Shopping Cart Pod with a low-priority class:

    apiVersion: v1
    kind: Pod
    metadata:
      name: shopping-cart-pod
      labels:
        app: shopping-cart
    spec:
      priorityClassName: low-priority
      containers:
      - name: shopping-cart-container
        image: nginx:cart

    With these pods and their respective priority classes, Kubernetes will allocate resources based on their importance, ensuring smooth operation even during peak loads.
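    Assuming the manifests above are saved to files, they can be applied with kubectl; the file names here are placeholders for wherever you saved them:

    ```shell
    kubectl apply -f priority-classes.yaml   # the three PriorityClass objects
    kubectl apply -f shop-pods.yaml          # checkout, product-rec, and cart pods
    ```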

    Commands to Witness the Magic:

    • Verify Priority Classes:

    kubectl get priorityclasses

    Note: Kubernetes includes two predefined Priority Classes: system-cluster-critical and system-node-critical. These classes are specifically designed to prioritize the scheduling of critical components, ensuring they are always scheduled first.

    • Check Pod Priority:
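    The command itself is missing from the post; assuming the three pods above are running, one way to list each pod’s resolved priority value and class is with custom columns:

    ```shell
    kubectl get pods -o custom-columns=NAME:.metadata.name,PRIORITY:.spec.priority,CLASS:.spec.priorityClassName
    ```

    The numeric priority field is populated from the priorityClassName at admission time, so the values should match the classes created earlier.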

    Conclusion:

    In Kubernetes, you have the flexibility to define how your pods are scheduled, ensuring that your critical pods receive priority over lower-priority pods during the scheduling process. To dig deeper into the concepts of Pod Priority, Priority Class, and Pod Preemption, refer to the following links.

  • Flame Engine : Unleashing Flutter’s Game Development Potential

    With Flutter, developers can leverage a single codebase to seamlessly build applications for diverse platforms, including Android, iOS, Linux, macOS, Windows, Google Fuchsia, and the web. The Flutter team remains dedicated to empowering developers of all backgrounds, ensuring effortless creation and publication of applications using this powerful multi-platform UI toolkit.
    Flutter makes developing standard applications effortless. However, if your aim is to craft an extraordinary game with stunning graphics, captivating gameplay, lightning-fast loading times, and highly responsive interactions, Flame emerges as the perfect solution.
    This blog will provide you with an in-depth understanding of Flame. Through the features provided by Flame, you will embark on a journey to master the art of building a Flutter game from the ground up. You will gain invaluable insights into seamlessly integrating animations, configuring immersive soundscapes, and efficiently managing diverse game assets.

    1. Flame engine

    Flame is a cutting-edge 2D modular game engine designed to provide a comprehensive suite of specialized solutions for game development. Leveraging the powerful architecture of Flutter, Flame significantly simplifies the coding process, empowering you to create remarkable projects with efficiency and precision.

    1.1. Setup: 

    Run this command with Flutter:

    $ flutter pub add flame

    This will add a line like this to your package’s pubspec.yaml (and run an implicit flutter pub get):

    dependencies:
      flame: ^1.8.1

    Import it, and now, in your Dart code, you can use:

    import 'package:flame/flame.dart';

    1.2. Assets Structure: 

    Flame introduces a well-structured assets directory framework, enabling seamless utilization of these resources within your projects.
    To illustrate the concepts further, let’s delve into a practical example that showcases the application of the discussed principles:

    Flame.images.load('card_sprites.png');
    FlameAudio.play('shuffling.mp3');

    When utilizing image and audio assets in Flame, you can simply specify the asset name without the need for the full path, given that you place the assets within the suggested directories as outlined below.

    For better organization, you have the option to divide your audio folder into two distinct subfolders: music and sfx.

    The music directory is intended for audio files used as background music, while the sfx directory is specifically designated for sound effects, encompassing shots, hits, splashes, menu sounds, and more.

    To properly configure your project, it is crucial to include entries for the above-mentioned directories in your pubspec.yaml file:
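    The entry itself did not make it into the post; a minimal sketch of that pubspec.yaml section, assuming the images directory plus the music and sfx audio subfolders described above:

    ```yaml
    flutter:
      assets:
        - assets/images/
        - assets/audio/music/
        - assets/audio/sfx/
    ```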

    1.3. Support to other platforms: 

    As Flame is built upon the robust foundation of Flutter, its platform support is inherently reliant on Flutter’s compatibility with various platforms. Therefore, the range of platforms supported by Flame is contingent upon Flutter’s own platform support.

    Presently, Flame offers extensive support for desktop platforms such as Windows, macOS, and Linux, in addition to mobile platforms, including Android and iOS. Furthermore, Flame also facilitates game development for the web. It is important to note that Flame primarily focuses on stable channel support, ensuring a reliable and robust experience. While Flame may not provide direct assistance for the dev, beta, and master channels, it is expected that Flame should function effectively in these environments as well.

    1.3.1. Flutter web: 

    To optimize the performance of your web-based game developed with Flame, it is recommended to ensure that your game is utilizing the CanvasKit/Skia renderer. By leveraging the canvas element instead of separate DOM elements, this choice enhances web performance significantly. Therefore, incorporating the CanvasKit/Skia renderer within your Flame-powered game is instrumental in achieving optimal performance on the web platform.

    To run your game using Skia, use the following command:

    flutter run -d chrome --web-renderer canvaskit

    To build the game for production, using Skia, use the following:

    flutter build web --release --web-renderer canvaskit

    2. Implementation

    2.1 GameWidget: 

    To integrate a Game instance into the Flutter widget tree, the recommended approach is to utilize the GameWidget. This widget serves as the root of your game application, enabling seamless integration of your game. You can incorporate a Game instance into the widget tree by following the example provided below:

    void main() {
      runApp(
        GameWidget(game: MyGame()),
      );
    }

    By adopting this approach, you can effectively add your Game instance to the Flutter widget tree, ensuring proper execution and integration of your game within the Flutter application structure.

    2.2 FlameGame:

    When developing games in Flutter, it is crucial to use a widget that can efficiently handle high refresh rates and rapid memory allocation and deallocation, and that provides richer functionality than the Stateless and Stateful widgets. Flame offers the FlameGame class, which excels at providing these capabilities.

    By utilizing the FlameGame class, you can create games by adding components to it. This class automatically calls the update and render methods of all the components added to it. Components can be directly added to the FlameGame through the constructor using the named children argument, or they can be added from anywhere else using the add or addAll methods.

    To incorporate the FlameGame into the widget tree, you need to pass its object to the GameWidget. Refer to the example below for clarification:

    class CardMatchGame extends FlameGame {
      @override
      Future<void> onLoad() async {
        await add(CardTable());
      }
    }
    
    void main() {
      final cardMatchGame = CardMatchGame();
      runApp(
        GameWidget(
          game: cardMatchGame,
        ),
      );
    }

    2.3 Component:

    This is the last piece of the puzzle. Components are the smallest individual building blocks that make up the game, much like widgets, but inside the game world. Every component can have other components as children, and all components inherit from the abstract class Component. These components serve as the fundamental entities responsible for rendering and interactivity within the game, and their hierarchical organization allows for flexible and modular construction of complex game systems in Flame. Each component also has its own lifecycle.

    Component Lifecycle: 

     

    Figure 01

    2.3.1. onLoad:

    The onLoad method serves as a crucial component within the game’s lifecycle, allowing for the execution of asynchronous operations such as image loading. Positioned between the onGameResize and onMount callbacks, this method is strategically placed to ensure the necessary assets are loaded and prepared. In Figure 01 of the component lifecycle, onLoad is set as the initial method due to its one-time execution. It is within this method that all essential assets, including images, audio files, and tmx files, should be loaded. This ensures that these assets are readily available for utilization throughout the game’s progression.

    2.3.2. onGameResize:

    Invoked when new components are added to the component tree or when the screen undergoes resizing, the onGameResize method plays a vital role in handling these events. It is executed before the onMount callback, allowing for necessary adjustments to be made in response to changes in component structure or screen dimensions.

    2.3.3. onParentResize:

    This method is triggered when the parent component undergoes a change in size or whenever the current component is mounted within the component tree. By leveraging the onParentResize callback, developers can implement logic that responds to parent-level resizing events and ensures the proper rendering and positioning of the component.

    2.3.4. onMount:

    As the name suggests, the onMount method is executed each time a component is mounted into the game tree. This critical method offers an opportunity to initialize the component and perform any necessary setup tasks before it becomes an active part of the game.

    2.3.5. onRemove:

    The onRemove method facilitates the execution of code just before a component is removed from the game tree. Regardless of whether the component is removed using the parent’s remove method or the component’s own remove method, this method ensures that the necessary cleanup actions take place in a single execution.

    2.3.6. onChildrenChanged:

    The onChildrenChanged method is triggered whenever a change occurs in a child component. Whether a child is added or removed, this method provides an opportunity to handle the updates and react accordingly, ensuring the parent component remains synchronized with any changes in its children.
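    To make the callback order concrete, here is a minimal sketch of a component overriding a few of these methods; the class name LifecycleLogger is ours, not Flame's:

    ```dart
    import 'package:flame/components.dart';

    class LifecycleLogger extends Component {
      @override
      Future<void> onLoad() async {
        // Runs once per component: load images, audio, and other assets here.
        await super.onLoad();
      }

      @override
      void onMount() {
        super.onMount();
        // Runs every time the component is mounted into the game tree.
      }

      @override
      void onRemove() {
        // Runs once just before the component leaves the tree: clean up here.
        super.onRemove();
      }
    }
    ```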

    2.3.7. Render & Update Loop:

    The Render method is responsible for generating the user interface, utilizing the available data to create the game screen. It provides developers with canvas objects, allowing them to draw the game’s visual elements. On the other hand, the Update method is responsible for modifying and updating this rendered UI. Changes such as resizing, repositioning, or altering the appearance of components are managed through the Update method. In essence, any changes observed in the size or position of a component can be attributed to the Update method, which ensures the dynamic nature of the game’s user interface.
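    As a hedged illustration of this split (MovingSquare is a made-up example, not part of the sample project below), a component keeps its state changes in update and its drawing in render:

    ```dart
    import 'dart:ui';

    import 'package:flame/components.dart';

    class MovingSquare extends PositionComponent {
      static const double _speed = 50; // pixels per second

      @override
      void update(double dt) {
        super.update(dt);
        x += _speed * dt; // update: mutate state each tick, scaled by delta time
      }

      @override
      void render(Canvas canvas) {
        super.render(canvas);
        // render: draw the current state; the canvas is already positioned
        // at this component's coordinates.
        canvas.drawRect(
          Rect.fromLTWH(0, 0, size.x, size.y),
          Paint()..color = const Color(0xFFE91E63),
        );
      }
    }
    ```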

    3. Sample Project

    To showcase the practical implementation of key classes like GameWidget, FlameGame, and essential Components within the Flame game engine, we will embark on the creation of a captivating action game. By engaging in this hands-on exercise, you will gain valuable insights and hands-on experience in utilizing Flame’s core functionalities and developing compelling games. Through this guided journey, you will unlock the knowledge and skills necessary to create engaging and immersive gaming experiences, while harnessing the power of Flame’s robust framework.

    Let’s start with:

    3.1. Packages & assets: 

    3.1.1. Create a project using the following command:

    flutter create flutter_game_poc

    3.1.2. Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    flame: ^1.8.1

    3.1.3. As mentioned earlier in the Assets Structure section, create a directory called assets in your project and include an images subdirectory within it. Download the assets from here and add both of them to the images directory.

    Figure 02 

    Figure 03

    In our game, we’ll use “Figure 02” as the background image and “Figure 03” as the avatar character who will be walking. If you have separate images for the avatar’s different walking frames, you can utilize a sprite generator tool to create a sprite sheet from those individual images.

    A sprite generator helps combine multiple separate images into a single sprite sheet, which enables efficient rendering and animation of the character in the game. You can find various sprite generator tools available online that can assist in generating a sprite sheet from your separate avatar images.

    By using a sprite sheet, you can easily manage and animate the character’s walking motion within the game, providing a smooth and visually appealing experience for the players.

    After adding the assets, your asset structure will look like this:

    Figure 04

    3.1.4. To use these assets, we have to register them in the assets section of pubspec.yaml:

    assets:
      - assets/images/

    3.2. Supporting code: 

    3.2.1. Create three directories, constants, overlays, and components, inside the lib directory.

    3.2.2. First, we will start with a constants directory where we have to create 4 files as follows:

       all_constants.dart. 

    export 'assets_constants.dart';
    export 'enum_constants.dart';
    export 'key_constants.dart';

       assets_constants.dart. 

    class AssetConstants {
     static String backgroundImage = 'background.png';
     static String avatarImage = 'avatar_sprite.png';
    }

       enum_constants.dart. 

    enum WalkingDirection { idle, up, down, left, right }

       key_constants.dart. 

    class KeyConstants {
     static String overlayKey = 'DIRECTION_BUTTON';
    }

    3.2.3. In addition to the constants directory, we will use the overlays directory for elements that need to be constantly visible to the user during the game. These elements typically include information such as the score, health, or action buttons.

    For our game, we will incorporate five control buttons that allow us to direct the gaming avatar’s movements. These buttons will remain visible on the screen at all times, facilitating player interaction and guiding the avatar’s actions within the game environment.

    Organizing these overlay elements in a separate directory makes it easier to manage and update the user interface components that provide vital information and interaction options to the player while the game is in progress.

    In order to effectively manage and control the position of all overlay widgets within our game, let’s create a dedicated controller. This controller will serve as a centralized entity responsible for orchestrating the placement and behavior of these overlay elements. Create a file named overlay_controller.dart.

    All the files in the overlays directory are common widgets that extend Stateless widget.

    class OverlayController extends StatelessWidget {
     final WalkingGame game;
     const OverlayController({super.key, required this.game});
    
    
     @override
     Widget build(BuildContext context) {
       return Column(children: [
         Row(children: [ButtonOverlay(game: game)])
       ]);
     }
    }

    3.2.4. In our game, all control buttons share a common design, featuring distinct icons and functionalities. To streamline the development process and maintain a consistent user interface, we will create a versatile widget called DirectionButton. This custom widget will handle the uniform UI design for all control buttons.

    Inside the overlays directory, create a directory called widgets and add a file called direction_button.dart to it. This file defines the shape and color of the control buttons, along with the ButtonOverlay widget that positions them on screen.

    class DirectionButton extends StatelessWidget {
     final IconData iconData;
     final VoidCallback onPressed;
    
    
     const DirectionButton(
         {super.key, required this.iconData, required this.onPressed});
    
    
     @override
     Widget build(BuildContext context) {
       return Container(
         height: 40,
         width: 40,
         margin: const EdgeInsets.all(4),
         decoration: const BoxDecoration(
             color: Colors.black45,
             borderRadius: BorderRadius.all(Radius.circular(10))),
         child: IconButton(
           icon: Icon(iconData),
           iconSize: 20,
           color: Colors.white,
           onPressed: onPressed,
         ),
       );
     }
    }

    class ButtonOverlay extends StatelessWidget {
     final WalkingGame game;
     const ButtonOverlay({Key? key, required this.game}) : super(key: key);
    
    
     @override
     Widget build(BuildContext context) {
       return SizedBox(
         height: MediaQuery.of(context).size.height,
         width: MediaQuery.of(context).size.width,
         child: Column(
           children: [
             Expanded(child: Container()),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_drop_up,
                   onPressed: () {
                     game.direction = WalkingDirection.up;
                   },
                 ),
                 const SizedBox(height: 50, width: 50)
               ],
             ),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_left,
                   onPressed: () {
                     game.direction = WalkingDirection.left;
                   },
                 ),
                 DirectionButton(
                   iconData: Icons.pause,
                   onPressed: () {
                     game.direction = WalkingDirection.idle;
                   },
                 ),
                 DirectionButton(
                   iconData: Icons.arrow_right,
                   onPressed: () {
                     game.direction = WalkingDirection.right;
                   },
                 ),
               ],
             ),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_drop_down,
                   onPressed: () {
                     game.direction = WalkingDirection.down;
                   },
                 ),
                 const SizedBox(height: 50, width: 50),
               ],
             ),
           ],
         ),
       );
     }
    }

    3.3. Core logic: 

    Moving forward, we will leverage the code we have previously implemented, building upon the foundations we have laid thus far:

    3.3.1. The first step is to create a component. As discussed earlier, all the individual elements in the game are components, so let’s create one component that will be our gaming avatar. For the UI of this avatar, we are going to use the asset shown in Figure 03.

    For the avatar, we will use SpriteAnimationComponent because we want this component to animate automatically.

    In the components directory, create a file called avatar_component.dart. This file will hold the logic of when and how our game avatar will move. 

    In the onLoad() method, we are loading the asset and using it to create animations, and in the update() method, we are using an enum to decide the walking animation.

    class AvatarComponent extends SpriteAnimationComponent with HasGameRef {
     final WalkingGame walkingGame;
     AvatarComponent({required this.walkingGame}) {
       add(RectangleHitbox());
     }
     late SpriteAnimation _downAnimation;
     late SpriteAnimation _leftAnimation;
     late SpriteAnimation _rightAnimation;
     late SpriteAnimation upAnimation;
     late SpriteAnimation _idleAnimation;
     final double _animationSpeed = .1;
    
    
     @override
     Future<void> onLoad() async {
       await super.onLoad();
    
    
       final spriteSheet = SpriteSheet(
         image: await gameRef.images.load(AssetConstants.avatarImage),
         // The sprite sheet holds 12 columns and 4 rows of frames,
         // so each frame is (totalWidth / 12) x (totalHeight / 4).
         srcSize: Vector2(2284 / 12, 1270 / 4),
       );
    
    
       _downAnimation =
           spriteSheet.createAnimation(row: 0, stepTime: _animationSpeed, to: 11);
       _leftAnimation =
           spriteSheet.createAnimation(row: 1, stepTime: _animationSpeed, to: 11);
       upAnimation =
           spriteSheet.createAnimation(row: 3, stepTime: _animationSpeed, to: 11);
       _rightAnimation =
           spriteSheet.createAnimation(row: 2, stepTime: _animationSpeed, to: 11);
       _idleAnimation =
           spriteSheet.createAnimation(row: 0, stepTime: _animationSpeed, to: 1);
       animation = _idleAnimation;
     }
    
    
     @override
     void update(double dt) {
       switch (walkingGame.direction) {
         case WalkingDirection.idle:
           animation = _idleAnimation;
           break;
         case WalkingDirection.down:
           animation = _downAnimation;
           if (y < walkingGame.mapHeight - height) {
             y += dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.left:
           animation = _leftAnimation;
           if (x > 0) {
             x -= dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.up:
           animation = upAnimation;
           if (y > 0) {
             y -= dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.right:
           animation = _rightAnimation;
           if (x < walkingGame.mapWidth - width) {
             x += dt * walkingGame.characterSpeed;
           }
           break;
       }
       super.update(dt);
     }
    }

    3.3.2. Our avatar is ready to walk now, but there is no map or world where he can do that. So, let’s create a game and add a background to it.

    Create a file named walking_game.dart in the lib directory and add the following code.

    class WalkingGame extends FlameGame with HasCollisionDetection {
     final double mapWidth = 2520;
     final double mapHeight = 1300;
     WalkingDirection direction = WalkingDirection.idle;
     final double characterSpeed = 80;
     final _world = World();
    
    
     // avatar sprite
     late AvatarComponent _avatar;
    
    
     // Background image
     late SpriteComponent _background;
     final Vector2 _backgroundSize = Vector2(2520, 1300);
    
    
     // Camera Components
     late final CameraComponent _cameraComponent;
    
    
     @override
     Future<void> onLoad() async {
       await super.onLoad();
    
    
       overlays.add(KeyConstants.overlayKey);
    
    
       _background = SpriteComponent(
         sprite: Sprite(
           await images.load(AssetConstants.backgroundImage),
           srcPosition: Vector2(0, 0),
           srcSize: _backgroundSize,
         ),
         position: Vector2(0, 0),
         size: Vector2(2520, 1300),
       );
       _world.add(_background);
    
    
       _avatar = AvatarComponent(walkingGame: this)
         ..position = Vector2(529, 128)
         ..debugMode = true
         ..size = Vector2(1145 / 24, 635 / 8);
    
    
       _world.add(_avatar);
    
    
       _cameraComponent = CameraComponent(world: _world)
         ..setBounds(Rectangle.fromLTRB(390, 200, mapWidth - 390, mapHeight - 200))
         ..viewfinder.anchor = Anchor.center
         ..follow(_avatar);
    
    
       addAll([_cameraComponent, _world]);
     }
    }

    First thing in onLoad(), you can see that we are adding an overlay using a key. You can learn more about this key in the main class.

    Next, we create a background component using SpriteComponent and add it to the world component. We use SpriteComponent instead of SpriteAnimationComponent because the background does not need any animation in our game.

    Then we add the AvatarComponent to the same world component that holds the background. To keep the camera fixed on the AvatarComponent, we use one extra component, the CameraComponent.

    Lastly, we add both the world and the CameraComponent to our game using the addAll() method.

    3.3.3. Finally, we have to create the main.dart file. In this example, we wrap the GameWidget in a MaterialApp because we want to use features of Material themes, such as icons, in this project. If you do not want that, you can pass the GameWidget directly to the runApp() method.
    Here we are not only adding the WalkingGame to the GameWidget but also adding an overlay that shows the control buttons. The key used for the overlay is the same key we added in the walking_game.dart file’s onLoad() method.

    void main() {
     WidgetsFlutterBinding.ensureInitialized();
     Flame.device.fullScreen();
     runApp(MaterialApp(
       home: Scaffold(
         body: GameWidget(
           game: WalkingGame(),
           overlayBuilderMap: {
             KeyConstants.overlayKey: (BuildContext context, WalkingGame game) {
               return OverlayController(game: game);
             }
           },
         ),
       ),
     ));
    }

    After all this, our game will look like this, and with these five control buttons, we can tell our avatar to move and stop.

    4. Result

    For your convenience, the complete code for the project can be found here. Feel free to refer to this code repository for a comprehensive overview of the implementation details and to access the entirety of the game’s source code.

    5. Conclusion

    The Flame game engine alleviates the burden of crucial tasks such as asset loading, managing refresh rates, and efficient memory management. By taking care of these essentials, Flame allows developers to concentrate on implementing the core functionality and creating an exceptional game application.

    By leveraging Flame’s capabilities, you can maximize your productivity and create an amazing game application that resonates with players across various platforms, all while enjoying the benefits of a unified codebase.

    6. References

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/flame
    3. https://docs.flame-engine.org/latest
    4. https://medium.flutterdevs.com/flame-with-flutter-4c6c3bd8931c
    5. https://supabase.com/blog/flutter-real-time-multiplayer-game
    6. https://www.kodeco.com/27407121-building-games-in-flutter-with-flame-getting-started
    7. https://blog.codemagic.io/flutter-flame-game-development/
    8. https://codelabs.developers.google.com/codelabs/flutter-flame-game