Tag: flutter

  • JNIgen: Simplify Native Integration in Flutter

    Prepare to embark on a groundbreaking journey through the realms of Flutter as we uncover a remarkable new feature: JNIgen. In this blog, we pull back the curtain to reveal JNIgen’s transformative power, from simplifying intricate tasks to amplifying scalability, and light the path to a seamlessly integrated Flutter ecosystem.

    As Flutter continues to mesmerize developers with its constant evolution, each release unveiling a treasure trove of thrilling new features, the highly anticipated Google I/O 2023 was an extraordinary milestone. Amidst the excitement, a groundbreaking technique was unveiled: JNIgen, offering effortless access to native code like never before.

    Let this blog guide you towards a future where your Flutter projects transcend limitations and manifest into awe-inspiring creations.

    1. What is JNIgen?

    JNIgen, which stands for Java Native Interface generator, is an innovative tool that automates the process of generating Dart bindings for Android APIs accessible through Java or Kotlin code. By utilizing these generated bindings, developers can invoke Android APIs with a syntax that closely resembles native code.

    With JNIgen, developers can seamlessly bridge the gap between Dart and the rich ecosystem of Android APIs. This empowers them to leverage the full spectrum of Android’s functionality, ranging from system-level operations to platform-specific features. By effortlessly integrating with Android APIs through JNIgen-generated bindings, developers can harness the power of native code and build robust applications with ease.

    1.1. Default approach: 

    In the current Flutter framework, we rely on Platform channels to establish a seamless communication channel between Dart code and native code. These channels serve as a bridge for exchanging messages and data.

    Typically, we have a Flutter app acting as the client, while the native code contains the desired methods to be executed. The Flutter app sends a message containing the method name to the native code, which then executes the requested method and sends the response back to the Flutter app.

    However, this approach requires the manual implementation of handlers on both the Dart and native code sides. It entails writing code to handle method calls and manage the exchange of responses. Additionally, developers need to carefully manage method names and channel names on both sides to ensure proper communication.
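To make that boilerplate concrete, here is a hedged sketch of the Dart side of a platform channel call (the channel name, method name, and return type are illustrative, not from any real project); a matching MethodCallHandler must still be written by hand in Java or Kotlin:

```dart
import 'package:flutter/services.dart';

// Illustrative channel name; it must match the string used when
// registering the handler in the Android host code.
const MethodChannel _channel = MethodChannel('com.example/hardware');

Future<String> getDeviceModel() async {
  try {
    // The method name 'getDeviceModel' must match the native side exactly;
    // a typo on either side only fails at runtime.
    final model = await _channel.invokeMethod<String>('getDeviceModel');
    return model ?? 'unknown';
  } on PlatformException catch (e) {
    // Thrown when the native handler is missing or throws.
    return 'Failed: ${e.message}';
  }
}
```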

    1.2. Working principle of JNIgen: 

    Figure 01

     

    In JNIgen, the path to our native code is passed to the JNIgen generator, which produces an intermediate layer of C code together with the necessary Dart boilerplate for accessing the C methods. All binding Dart and C files are automatically generated in the directories specified in the .yaml file, which we will explore shortly.

    Consequently, as a Flutter application, our interaction is solely focused on interfacing with the newly generated Dart code, eliminating the need for direct utilization of native code.
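As a sketch of what that interaction looks like (using the HardwareUtils class built in the sample project later in this blog), a call through the generated bindings reads like an ordinary Dart method call:

```dart
// HardwareUtils is the Dart class generated by JNIgen from the Java
// class of the same name; no channel or handler code is needed.
final utils = HardwareUtils();
final details = utils.getHardwareDetails(); // backed by the Java method
```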

    1.3. Similar tools: 

    During the Google I/O 2023 event, JNIgen was introduced as a tool for native code integration. However, it is worth noting that not all external libraries available on pub.dev rely exclusively on platform channels. Another tool, FFIgen, was introduced earlier at Google I/O 2021 and serves a similar purpose. Both FFIgen and JNIgen function similarly, generating intermediate C code and the corresponding Dart bindings to establish the necessary connections.

    While JNIgen primarily facilitates communication between Android native code and Dart code, FFIgen has become the preferred choice for establishing communication between iOS native code and Dart code. Both tools are specifically designed to convert native code into intermediate code, enabling seamless interoperability within their respective platforms.

    2. Configuration

    Prior to proceeding with the code implementation, it is essential to set up and install the necessary tools.

    2.1. System setup: 

    2.1.1 Install Maven

    Windows

    • Download the Maven archive for Windows from the link here [download the Binary zip archive]
    • After extracting the zip file, you will get a folder named “apache-maven-x.x.x”
    • Create a new folder named “ApacheMaven” in “C:\Program Files” and move the extracted folder into it. [Your path will be “C:\Program Files\ApacheMaven\apache-maven-x.x.x”]
    • Add the following entries under “Environment Variables” → “User Variables”:
      M2 ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x\bin”
      M2_HOME ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x”
    • Add a new entry “%M2_HOME%\bin” to the “Path” variable

    Mac

    • Download Maven archive for mac from the link here [download Binary tar.gz archive]
    • Run the following command where you have downloaded the *.tar.gz file
    tar -xvf apache-maven-3.8.7.bin.tar.gz

    • Add the following entry in .zshrc or .bash_profile to set the Maven path:
      export PATH="$PATH:/Users/username/Downloads/apache-maven-x.x.x/bin"

    • JNIgen also relies on the LLVM toolchain (which bundles clang-format, used to format the generated bindings). You can use brew to install llvm:
    brew install llvm

    • Brew will print instructions like the following for further setup:
    ==> llvm
    To use the bundled libc++ please add the following LDFLAGS:
    LDFLAGS="-L/opt/homebrew/opt/llvm/lib/c++ -Wl,-rpath,/opt/homebrew/opt/llvm/lib/c++"
    
    llvm is keg-only, which means it was not symlinked into /opt/homebrew,
    because macOS already provides this software and installing another version in
    parallel can cause all kinds of trouble.
    
    If you need to have llvm first in your PATH, run:
    echo 'export PATH="/opt/homebrew/opt/llvm/bin:$PATH"' >> ~/.zshrc
    
    For compilers to find llvm you may need to set:
    export LDFLAGS="-L/opt/homebrew/opt/llvm/lib"
    export CPPFLAGS="-I/opt/homebrew/opt/llvm/include"

    2.1.2 Install Clang-Format

    Windows

    • Download the latest version of LLVM for Windows from the link here

    Mac

    • Run the following brew command: 
    brew install clang-format

    2.2. Flutter setup: 

    2.2.1 Get Dependencies

    Run the following commands in your Flutter project’s root directory:

    flutter pub add jni

    flutter pub add jnigen

    2.2.2 Setup configuration file

    Figure 01 provides a visual representation of the .yaml file, which holds crucial configurations utilized by JNIgen. These configurations serve the purpose of identifying paths for native classes, as well as specifying the locations where JNIgen generates the resulting C and Dart files. Furthermore, the .yaml file allows for specifying Maven configurations, enabling the selection of specific third-party libraries that need to be downloaded to facilitate code generation.

    By leveraging the power of the .yaml file, developers gain control over the path identification process and ensure that the generated code is placed in the desired locations. Additionally, the ability to define Maven configurations grants flexibility in managing dependencies, allowing the seamless integration of required third-party libraries into the generated code. This comprehensive approach enables precise control and customization over the code generation process, enhancing the overall efficiency and effectiveness of the development workflow.

    Let’s explore the properties that we have utilized within the .yaml file (Please refer “3.2.2. code implementation” section’s example for better understanding):

    • android_sdk_config: 

    When the “add_gradle_deps” property is set to “true,” a Gradle stub is executed when JNIgen is invoked, and the Android compile classpath is added to JNIgen’s classpath. To ensure that all dependencies are cached appropriately, a release build must have been performed beforehand.

    • output 

    As the name implies, the “output” section defines the configuration related to the generation of intermediate code. This section plays a crucial role in determining how the intermediate code will be generated and organized.

    •  c >> library_name && c >> path:

      Here we are setting the details for the C-based binding code.

    •  dart >> path && dart >> structure:

      Here we are defining the configuration for the Dart-based binding code.

    •  source_path:

    These are specific directories that are scanned during the process of locating the relevant source files.

    •  classes:

    By providing a comprehensive list of classes or packages, developers can effectively control the scope of the code generation process. This ensures that the binding code is generated only for the desired components, minimizing unnecessary code generation.

    By utilizing these properties within the .yaml file, developers can effectively control various aspects of the code generation process, including path identification, code organization, and dependency management. To get more in-depth information, please check out the official documentation here.
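Putting these properties together, a minimal skeleton of the file might look like the following (the library name, paths, and package are placeholders; section 3.2.2 contains the concrete version used in the sample project):

```yaml
android_sdk_config:
  add_gradle_deps: true          # include the Android compile classpath

output:
  c:
    library_name: my_bindings    # name of the generated C library
    path: src/                   # directory for the generated C code
  dart:
    path: lib/my_bindings.dart   # file for the generated Dart bindings
    structure: single_file

source_path:
  - 'android/app/src/main/java'  # directories scanned for source files

classes:
  - 'com.example.utils'          # generate bindings only for these
```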

    2.3. Generate bindings files:

    Once this setup is complete, the final step is to produce the jar files that JNIgen scans to generate the required bindings. Build the Android APK by executing the following command:

    flutter build apk

    Run the following command in your terminal to generate code:

    dart run jnigen --config jnigen.yaml

    2.4. Android setup: 

    Add the path of the CMakeLists.txt file to your android >> app >> build.gradle file, inside the android block’s externalNativeBuild section:

    android {
        // ...
        externalNativeBuild {
            cmake {
                path "<address of CMakeLists.txt>"
            }
        }
    }

    With this configuration, we specify the path of the CMake file that will be generated by JNIgen. This path declaration is crucial for locating the generated CMake file within the project structure.

    With the completion of the aforementioned steps, you are now ready to run your application and leverage all the native functions that have been integrated.

    3. Sample Project

    To gain hands-on experience and better understand the JNIgen, let’s create a small project together. Follow the steps below to get started. 

    Let’s start with:

    3.1. Packages & directories:

    3.1.1 Create a project using the following command:

    flutter create jnigen_integration_project

    3.1.2 Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    jni: ^0.5.0
    jnigen: ^0.5.0

    3.1.3. Go to the android >> app >> src >> main directory.

    3.1.4. Create directories inside main as shown below:

    Figure 02 

    3.2. Code Implementation:

    3.2.1 We will start with Android code. Create 2 files HardwareUtils.java & HardwareUtilsKotlin.kt inside the utils directory.

     HardwareUtilsKotlin.kt

    package com.hardware.utils
    
    import android.os.Build
    
    class HardwareUtilsKotlin {
    
       fun getHardwareDetails(): Map<String, String>? {
           val hardwareDetails: MutableMap<String, String> = HashMap()
           hardwareDetails["Language"] = "Kotlin"
           hardwareDetails["Manufacture"] = Build.MANUFACTURER
           hardwareDetails["Model No."] = Build.MODEL
           hardwareDetails["Type"] = Build.TYPE
           hardwareDetails["User"] = Build.USER
           hardwareDetails["SDK"] = Build.VERSION.SDK
           hardwareDetails["Board"] = Build.BOARD
           hardwareDetails["Version Code"] = Build.VERSION.RELEASE
           return hardwareDetails
       }
    }

     HardwareUtils.java 

    package com.hardware.utils;
    
    
    import android.os.Build;
    
    
    import java.util.HashMap;
    import java.util.Map;
    
    
    public class HardwareUtils {
    
    
       public Map<String, String> getHardwareDetails() {
           Map<String, String> hardwareDetails = new HashMap<String, String>();
           hardwareDetails.put("Language", "JAVA");
           hardwareDetails.put("Manufacture", Build.MANUFACTURER);
           hardwareDetails.put("Model No.", Build.MODEL);
           hardwareDetails.put("Type", Build.TYPE);
           hardwareDetails.put("User", Build.USER);
           hardwareDetails.put("SDK", Build.VERSION.SDK);
           hardwareDetails.put("Board", Build.BOARD);
           hardwareDetails.put("Version Code", Build.VERSION.RELEASE);
           return hardwareDetails;
       }
    
    
       public Map<String, String> getHardwareDetailsKotlin() {
           return new HardwareUtilsKotlin().getHardwareDetails();
       }
    
    
    }

    3.2.2 To provide the necessary configurations to JNIgen for code generation, we will create a .yaml file named jnigen.yaml in the root of the project.

       jnigen.yaml 

    android_sdk_config:
     add_gradle_deps: true
    
    
    output:
     c:
       library_name: hardware_utils
       path: src/
     dart:
       path: lib/hardware_utils.dart
       structure: single_file
    
    
    source_path:
     - 'android/app/src/main/java'
    
    
    classes:
     - 'com.hardware.utils'

    3.2.3 Let’s generate C & Dart code.

    Execute the following command to create APK:

    flutter build apk

    After the successful execution of the above command, execute the following command:

    dart run jnigen --config jnigen.yaml

    3.2.4 Add the path of CMakeLists.txt to your android >> app >> build.gradle file, inside the android block’s externalNativeBuild section, as shown below:

    android {
        // ...
        externalNativeBuild {
            cmake {
                path "../../src/CMakeLists.txt"
            }
        }
    }

    3.2.5. The final step is to call, from Dart, the methods generated by JNIgen.

    To do this, replace the MyHomePage class code in the main.dart file with the code below. (You will also need to import the jni package and the generated lib/hardware_utils.dart at the top of main.dart.)

    class MyHomePage extends StatefulWidget {
     const MyHomePage({super.key, required this.title});
    
     final String title;
    
     @override
     State<MyHomePage> createState() => _MyHomePageState();
    }
    
    class _MyHomePageState extends State<MyHomePage> {
     String _hardwareDetails = '';
     String _hardwareDetailsKotlin = '';
     JObject activity = JObject.fromRef(Jni.getCurrentActivity());
    
     @override
     void initState() {
       JMap<JString, JString> deviceHardwareDetails =
           HardwareUtils().getHardwareDetails();
       _hardwareDetails = 'This device details from Java class:\n';
       deviceHardwareDetails.forEach((key, value) {
         _hardwareDetails =
             '$_hardwareDetails\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       JMap<JString, JString> deviceHardwareDetailsKotlin =
           HardwareUtils().getHardwareDetailsKotlin();
       _hardwareDetailsKotlin = 'This device details from Kotlin class:\n';
       deviceHardwareDetailsKotlin.forEach((key, value) {
         _hardwareDetailsKotlin =
             '$_hardwareDetailsKotlin\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       setState(() {
         _hardwareDetails;
         _hardwareDetailsKotlin;
       });
       super.initState();
     }
    
     @override
     Widget build(BuildContext context) {
       return Scaffold(
         appBar: AppBar(
           title: Text(widget.title),
         ),
         body: Center(
           child: Column(
             mainAxisAlignment: MainAxisAlignment.center,
             children: <Widget>[
               Text(
                 _hardwareDetails,
                 textAlign: TextAlign.center,
               ),
               SizedBox(height: 20,),
               Text(
                 _hardwareDetailsKotlin,
                 textAlign: TextAlign.center,
               ),
             ],
           ),
         ),
       );
     }
    }

    After all of this, when we launch our app, we will see information about our Android device.

    4. Result

    For your convenience, the complete code for the project can be found here. Feel free to refer to this code repository for a comprehensive overview of the implementation details and to access the entirety of the source code.

    5. Conclusion

    In conclusion, we explored the limitations of the traditional approach to native API access in Flutter for mid to large-scale projects. Through our insightful exploration of JNIgen’s working principles, we uncovered its remarkable potential for simplifying the native integration process.

    By gaining a deep understanding of JNIgen’s inner workings, we successfully developed a sample project and provided detailed guidance on the essential setup requirements. Armed with this knowledge, developers can embrace JNIgen’s capabilities to streamline their native integration process effectively.

    We can say that JNIgen is a valuable tool for Flutter developers seeking to combine the power of Flutter’s cross-platform capabilities with the flexibility and performance benefits offered by native code. It empowers developers to build high-quality apps that seamlessly integrate platform-specific features and existing native code libraries, ultimately enhancing the overall user experience. 

    Hopefully, this blog post has inspired you to explore the immense potential of JNIgen in your Flutter applications. By harnessing the JNIgen, we can open doors to new possibilities.

    Thank you for taking the time to read through this blog!

    6. Reference

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/jnigen
    3. https://pub.dev/packages/jni
    4. https://github.com/dart-lang/jnigen
    5. https://github.com/dart-lang/jnigen#readme
    6. https://github.com/dart-lang/jnigen/wiki/Architecture-&-Design-Notes
    7. https://medium.com/simform-engineering/jnigen-an-easy-way-to-access-platform-apis-cb1fd3101e33
    8. https://medium.com/@marcoedomingos/the-ultimate-showdown-methodchannel-vs-d83135f2392d
  • Serverpod: The Ultimate Backend for Flutter

    Join us on this exhilarating journey, where we bridge the gap between frontend and backend development with the seamless integration of Serverpod and Flutter.

    Gone are the days of relying on different programming languages for frontend and backend development. With Flutter’s versatile framework, you can effortlessly create stunning user interfaces for a myriad of platforms. However, the missing piece has always been the ability to build the backend in Dart as well—until now.

    Introducing Serverpod, the missing link that completes the Flutter ecosystem. Now, with Serverpod, you can develop your entire application, from frontend to backend, all within the familiar and elegant Dart language. This synergy enables a seamless exchange of data and functions between the client and the server, reducing development complexities and boosting productivity.

    1. What is Serverpod?

    As a developer or tech enthusiast, we recognize the critical role backend services play in the success of any application. Whether you’re building a web, mobile, or desktop project, a robust backend infrastructure is the backbone that ensures seamless functionality and scalability.

    That’s where “Serverpod” comes into the picture—an innovative backend solution developed entirely in Dart, just like your Flutter projects. With Serverpod at your disposal, you can harness the full power of Dart on both the frontend and backend, creating a harmonious development environment that streamlines your workflow.

    The biggest advantage of using Serverpod is that it automates protocol and client-side code generation by analyzing your server, making remote endpoint calls as simple as local method calls.
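As a sketch of what that means in practice (the class and method names here are illustrative, not from any real project), an endpoint defined on the server becomes a generated client method that reads like a local call:

```dart
// Server side (lib/src/endpoints): a plain Dart class extending Endpoint.
// Serverpod analyzes this class and generates the matching client code.
class GreetingEndpoint extends Endpoint {
  Future<String> hello(Session session, String name) async {
    return 'Hello, $name!';
  }
}

// Client side, after running `serverpod generate` (auto-generated):
// final greeting = await client.greeting.hello('World');
```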

    1.1. Current market status

    The top 10 backend development languages and frameworks in 2023 are as follows: 

    [Note: The results presented here are not absolute and are based on a combination of surveys conducted in 2023, including ‘Stack Overflow Developer Survey – 2023,’ ‘State of the Developer Ecosystem Survey,’ ‘New Stack Developer Survey,’ and more.]

    • Node.js – ~32%
    • Python (Django, Flask) – ~28%
    • Java (Spring Boot, Java EE) – ~18%
    • Ruby (Ruby on Rails) – ~7%
    • PHP (Laravel, Symfony) – ~6%
    • Go (Golang) – ~3%
    • .NET (C#) – ~2%
    • Rust – ~1%
    • Kotlin (Spring Boot with Kotlin) – ~1%
    • Express.js (for Node.js) – ~1%
    Figure 01

    Figure 01 provides a comprehensive overview of the current usage of backend development technologies, showcasing a plethora of options with diverse features and capabilities. However, the landscape takes a different turn when it comes to frontend development. While the backend technologies offer a wealth of choices, most of these languages lack native multiplatform support for frontend applications.

    As a result, developers find themselves in a situation where they must choose between two sets of languages or technologies for backend and frontend business logic development.

    1.2. New solution

    As the demand for multiplatform applications continues to grow, developers are actively exploring new frameworks and languages that bridge the gap between backend and frontend development. Recently, a groundbreaking solution has emerged in the form of Serverpod. With Serverpod, developers can now accomplish server development in Dart, filling the crucial gap that was previously missing in the Flutter ecosystem.

    Flutter has already demonstrated its remarkable support for a wide range of platforms. The absence of server development capabilities was a notable limitation that has now been triumphantly addressed with the introduction of Serverpod. This remarkable achievement enables developers to harness the power of Dart to build both frontend and backend components, creating unified applications with a shared codebase.

    2. Configurations 

    Prior to proceeding with the code implementation, it is essential to set up and install the necessary tools.

    [Note: Given Serverpod’s initial stage, encountering errors without readily available online solutions is plausible. In such instances, seeking assistance from the Flutter community forum is highly recommended. Drawing from my experience, I suggest running the application on Flutter web first, particularly for Serverpod version 1.1.1, to ensure a smoother development process and gain insights into potential challenges.]

    2.1. Initial setup

    2.1.1 Install Docker

    Docker serves a crucial role in Serverpod, facilitating:

    • Containerization: Applications are packaged and shipped as containers, enabling seamless deployment and execution across diverse infrastructures.
    • Isolation: Applications are isolated from one another, enhancing both security and performance aspects, safeguarding against potential vulnerabilities, and optimizing system efficiency.

    Download & Install Docker from here.

    2.1.2 Install Serverpod CLI 

    • Run the following command:
    dart pub global activate serverpod_cli

    • Now test the installation by running:
    serverpod

    With proper configuration, the Serverpod command displays help information.

    2.2. Project creation

    To initiate serverpod commands, the Docker application must be launched first. Ensuring an active Docker instance in the backend environment is imperative to execute Serverpod commands successfully.

    • Create a new project with the command:
    serverpod create <your_project_name>

    Upon execution, a new directory will be generated with the specified project name, comprising three Dart packages:

    <your_project_name>_server: This package is designated for server-side code, encompassing essential components such as business logic, API endpoints, DB connections, and more.
    <your_project_name>_client: Within this package, the code responsible for server communication is auto-generated. Manual editing of files in this package is typically avoided.
    <your_project_name>_flutter: Representing the Flutter app, it comes pre-configured to seamlessly connect with your local server, ensuring seamless communication between frontend and backend elements.

    2.3. Project execution

    Step 1: Navigate to the server package with the following command:

    cd <your_project_name>/<your_project_name>_server

    Step 2: (Optional) Open the project in the VS Code IDE using the command:

    (Note: You can use any IDE you prefer, but for our purposes, we’ll use VS Code, which also simplifies DB connection later.)

    code .

    Step 3: Once the project is open in the IDE, set up the database tables by running the provided script:

    .\setup-tables.cmd

    Step 4: Before starting the server, initiate new Docker containers with the following command:

    docker-compose up --build --detach

    Step 5: The command above will start PostgreSQL and Redis containers, and you should receive the output:

    ~> docker-compose up --build --detach
    	[+] Running 2/2
     	✔ Container <your_project_name>_server-redis-1     Started                                                                                                
     	✔ Container <your_project_name>_server-postgres-1  Started

    (Note: If the output doesn’t match, refer to this Stack Overflow link for missing commands in the official documentation.)

    Step 6: Proceed to start the server with this command:

    dart bin/main.dart

    Step 7: Upon successful execution, you will receive the following output, where the “Server Default listening on port” value is crucial. Please take note of this value.

    ~> dart bin/main.dart
     	SERVERPOD version: 1.1.1, dart: 3.0.5 (stable) (Mon Jun 12 18:31:49 2023 +0000) on "windows_x64", time: 2023-07-19 15:24:27.704037Z
     	mode: development, role: monolith, logging: normal, serverId: default
     	Insights listening on port 8081
     	Server default listening on port 8080
     	Webserver listening on port 8082
     	CPU and memory usage metrics are not supported on this platform.

    Step 8: Combine “localhost” (i.e., “127.0.0.1”) with the “Server default listening on port” value and load that URL in your browser. Accessing “localhost:8080” will display the default output, indicating that your server is running and ready to process requests.

    Figure 02

    Step 9: Now, as the containers reach the “Started” state, you can establish a connection with the database. We have opted for PostgreSQL as our DB choice, and the rationale behind this selection lies in the “docker-compose.yaml” file at the server project’s root. In the “service” section, PostgreSQL is already added, making it an ideal choice as the required setup is readily available. 

    Figure 03

    For the database setup, we need key information, such as Host, Port, Username, and Password. You can find all this vital information in the “config” directory’s “development.yaml” and “passwords.yaml” files. If you encounter difficulties locating these details, please refer to Figure 04.

    Figure 04
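For orientation, the relevant entries in those files look roughly like the following (a hedged sketch; the actual values, including the generated database password, are specific to your project):

```yaml
# config/development.yaml (excerpt, illustrative)
database:
  host: localhost
  port: 8090
  name: your_project_name
  user: postgres

# config/passwords.yaml (excerpt, illustrative)
development:
  database: 'your-generated-database-password'
```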

    Step 10: To establish the connection, you can install an application such as Postico or, alternatively, use the “MySQL” (Database Client) extension, which can be installed in VS Code with a single click and also supports PostgreSQL connections.

    Figure 05

    Step 11: Follow these next steps:

    1. Select the “Database” option.
    2. Click on “Create Connection.”
    3. Choose the “PostgreSQL” option.
    4. Add a name for your Connection.
    5. Fill in the information collected in the last step.
    6. Finally, select the “Connect” option.
    Figure 06
    7. Upon success, you will receive a “Connect Success!” message, and the new connection will be added to the Explorer Tab.
    Figure 07

    Step 12: Now, we shift our focus to the Flutter project (Frontend):

    Thus far, we have been working on the server project. Let us open a new VS Code instance for a separate Flutter project while keeping the current VS Code instance active, serving as the server.

    Step 13: Execute the following command to run the Flutter project on Chrome:

    flutter run -d chrome

    With this, the default project will generate the following output:

    Step 14: When you are finished, you can shut down Serverpod with “Ctrl-C.”

    Step 15: Then stop Postgres and Redis.

    docker compose stop

    Figure 08

    3. Sample Project

    So far, we have successfully created and executed the project, identifying three distinct components. The server project caters to server/backend development, while the Flutter project handles application/frontend development. The client project, automatically generated, serves as the vital intermediary, bridging the gap between the frontend and backend.

    However, merely acknowledging the projects’ existence is insufficient. To maximize our proficiency, it is crucial to grasp the code and file structure comprehensively. To achieve this, we will embark on a practical journey, constructing a small project to gain hands-on experience and unlock deeper insights into all three components. This approach empowers us with a well-rounded understanding, further enhancing our capabilities in building remarkable applications.

    3.1. What are we building?

    In this blog, we will construct a sample project with basic Login and SignUp functionality. The SignUp process will collect user information such as Email, Password, Username, and age. Users can subsequently log in using their email and password, leading to the display of user details on the dashboard screen. With the initial system setup complete and the newly created project up and running, it’s time to commence coding. 

    3.1.1 Create custom models for API endpoints

    Step 1: Create a new file in the “lib >> src >> protocol” directory named “users.yaml”:

    class: Users
    table: users
    fields:
      username: String
      email: String
      password: String
      age: int

    Step 2: Save the file and run the following command to generate essential data classes and table creation queries:

    serverpod generate

    (Note: Add “--watch” after the command for continuous code generation.) 

    Successful execution of the above command will generate a new file named “users.dart” in the “lib >> src >> generated” folder. Additionally, the “tables.pgsql” file now contains SQL queries for creating the “users” table. The same command updates the auto-generated code in the client project. 
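The generated users.dart should not be edited by hand; regenerate it with serverpod generate instead. Its shape is roughly the following (a simplified sketch, not the exact generated code, which also contains serialization and table metadata):

```dart
// Simplified sketch of the class serverpod generates from users.yaml.
// Because users.yaml declares `table: users`, the class maps to a
// database row (the id column is managed by Serverpod).
class Users extends TableRow {
  Users({
    int? id,
    required this.username,
    required this.email,
    required this.password,
    required this.age,
  }) : super(id);

  String username;
  String email;
  String password;
  int age;
}
```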

    3.1.2 Create Tables in DB for the generated model 

    Step 1: Copy the queries written in the “generated >> tables.pgsql” file.

    In the MySQL Extension’s Database section, select the created database >> [project_name] >> public >> Tables >> + (Create New Table).

    Figure 09

    Step 2: Paste the queries into the newly created .sql file and click “Execute” above both queries.

    Figure 10

    Step 3: After execution, you will obtain an empty table with the “id” as the Primary key.

    Figure 11

    If you find multiple tables already present in your database, as shown in the next figure, you can ignore them. These tables are created by the system using the queries in the “generated >> tables-serverpod.pgsql” file.

    Figure 12

    3.1.3 Create an API endpoint

    Step 1: Generate a new file in the “lib >> src >> endpoints” directory named “session_endpoints.dart”:

    import 'package:serverpod/serverpod.dart';
    
    import '../generated/protocol.dart';
    
    class SessionEndpoint extends Endpoint {
      Future<Users?> login(Session session, String email, String password) async {
        List<Users> userList = await Users.find(session,
            where: (p0) =>
                (p0.email.equals(email)) & (p0.password.equals(password)));
        return userList.isEmpty ? null : userList[0];
      }
    
    
      Future<bool> signUp(Session session, Users newUser) async {
        try {
          await Users.insert(session, newUser);
          return true;
        } catch (e) {
          print(e.toString());
          return false;
        }
      }
    }

    If “serverpod generate --watch” is already running, you can skip Step 2.

    Step 2: Run the command:

    serverpod generate

    Step 3: Start the server.
    [For help, check out Steps 1 to 6 mentioned in the Project Execution part.]

    3.1.4 Create three screens

    Login Screen:

    Figure 13

    SignUp Screen:

    Figure 14

    Dashboard Screen:

    Figure 15

    3.1.5 Set up Flutter code

    Step 1: Add the following code to the SignUp button’s handler on the SignUp screen to handle user signups.

    try {
      final result = await client.session.signUp(
        Users(
          email: _emailEditingController.text.trim(),
          username: _usernameEditingController.text.trim(),
          password: _passwordEditingController.text.trim(),
          age: int.parse(_ageEditingController.text.trim()),
        ),
      );
      if (result) {
        Navigator.pop(context);
      } else {
        _errorText = 'Something went wrong, Try again.';
      }
    } catch (e) {
      debugPrint(e.toString());
      _errorText = e.toString();
    }

    Step 2: Add the following code to the Login button’s handler on the Login screen to handle user logins.

    try {
      final result = await client.session.login(
        _emailEditingController.text.trim(),
        _passwordEditingController.text.trim(),
      );
      if (result != null) {
        _emailEditingController.text = '';
        _passwordEditingController.text = '';
        Navigator.push(
          context,
          MaterialPageRoute(
            builder: (context) => DashboardPage(user: result),
          ),
        );
      } else {
        _errorText = 'Something went wrong, Try again.';
      }
    } catch (e) {
      debugPrint(e.toString());
      _errorText = e.toString();
    }

    Step 3: Implement logic to display user data on the dashboard screen.
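    As a starting point, a minimal DashboardPage might simply render the fields of the Users object handed over by the login flow. The widget below is only a sketch under that assumption; field names follow the model defined earlier, and styling is left to you:

```dart
import 'package:flutter/material.dart';

// Sketch of a minimal dashboard. Assumes the generated Users model
// (username, email, age fields) is available via the client package import.
class DashboardPage extends StatelessWidget {
  final Users user;
  const DashboardPage({Key? key, required this.user}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Dashboard')),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            Text('Welcome, ${user.username}!'),
            Text('Email: ${user.email}'),
            Text('Age: ${user.age}'),
          ],
        ),
      ),
    );
  }
}
```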

    With these steps completed, our Flutter app becomes a fully functional project, showcasing the power of this new technology. Armed with Dart knowledge, every Flutter developer can transform into a proficient full-stack developer.

    4. Result

    Figure 16

    To facilitate your exploration, the entire project code is conveniently available in this code repository. Feel free to refer to this repository for an in-depth understanding of the implementation details and access to the complete source code, enabling you to delve deeper into the project’s intricacies and leverage its functionalities effectively.

    5. Conclusion

    In conclusion, we have provided a comprehensive walkthrough of the step-by-step setup process for running Serverpod seamlessly. We explored creating data models, integrating the database with our server project, defining tables, executing data operations, and establishing accessible API endpoints for Flutter applications.

    Hopefully, this blog post has kindled your curiosity to delve deeper into Serverpod’s immense potential for elevating your Flutter applications. Embracing Serverpod unlocks a world of boundless possibilities, empowering you to achieve remarkable feats in your development endeavors.

    Thank you for investing your time in reading this informative blog!

    6. References

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/serverpod/
    3. https://serverpod.dev/
    4. https://docs.docker.com/get-docker/
    5. https://medium.com/serverpod/introducing-serverpod-a-complete-backend-for-flutter-written-in-dart-f348de228e19
    6. https://medium.com/serverpod/serverpod-our-vision-for-a-seamless-scalable-backend-for-the-flutter-community-24ba311b306b
    7. https://stackoverflow.com/questions/76180598/serverpod-sql-error-when-starting-a-clean-project
    8. https://www.youtube.com/watch?v=3Q2vKGacfh0
    9. https://www.youtube.com/watch?v=8sCxWBWhm2Y

  • Integrating Augmented Reality in a Flutter App to Enhance User Experience

    In recent years, augmented reality (AR) has emerged as a cutting-edge technology that has revolutionized various industries, including gaming, retail, education, and healthcare. Its ability to blend digital information with the real world has opened up a new realm of possibilities. One exciting application of AR is integrating it into mobile apps to enhance the user experience.

    In this blog post, we will explore how to leverage Flutter, a powerful cross-platform framework, to integrate augmented reality features into mobile apps and elevate the user experience to new heights.

    Understanding Augmented Reality:

    Before we dive into the integration process, let’s briefly understand what augmented reality is. Augmented reality is a technology that overlays computer-generated content onto the real world, enhancing the user’s perception and interaction with their environment. Unlike virtual reality (VR), which creates a fully simulated environment, AR enhances the real world by adding digital elements such as images, videos, and 3D models.

    The applications of augmented reality are vast and span across different industries. In gaming, AR has transformed mobile experiences by overlaying virtual characters and objects onto the real world. It has also found applications in areas such as marketing and advertising, where brands can create interactive campaigns by projecting virtual content onto physical objects or locations. AR has also revolutionized education by offering immersive learning experiences, allowing students to visualize complex concepts and interact with virtual models.

    In the upcoming sections, we will explore the steps to integrate augmented reality features into mobile apps using Flutter.

    What is Flutter?

    Flutter is an open-source UI (user interface) toolkit developed by Google for building natively compiled applications for mobile, web, and desktop platforms from a single codebase. It allows developers to create visually appealing and high-performance applications with a reactive and customizable user interface.

    The core language used in Flutter is Dart, which is also developed by Google. Dart is a statically typed, object-oriented programming language that comes with modern features and syntax. It is designed to be easy to learn and offers features like just-in-time (JIT) compilation during development and ahead-of-time (AOT) compilation for optimized performance in production.

    Flutter provides a rich set of customizable UI widgets that enable developers to build beautiful and responsive user interfaces. These widgets can be composed and combined to create complex layouts and interactions, giving developers full control over the app’s appearance and behavior.

    Why Choose Flutter for AR Integration?

    Flutter, backed by Google, is a versatile framework that enables developers to build beautiful and performant cross-platform applications. Its rich set of UI components and fast development cycle make it an excellent choice for integrating augmented reality features. By using Flutter, developers can write a single codebase that runs seamlessly on both Android and iOS platforms, saving time and effort.

    Flutter’s cross-platform capabilities enable developers to write code once and deploy it on multiple platforms, including iOS, Android, web, and even desktop (Windows, macOS, and Linux).

    The Flutter ecosystem is supported by a vibrant community, offering a wide range of packages and plugins that extend its capabilities. These packages cover various functionalities such as networking, database integration, state management, and more, making it easy to add complex features to your Flutter applications.

    Steps to Integrate AR in a Flutter App:

    Step 1: Set Up Flutter Project:

    Assuming you already have Flutter installed on your system, create a new Flutter project or open an existing one to start integrating AR features. If not, follow the installation guide at https://docs.flutter.dev/get-started/install to set up Flutter.

    Step 2: Add ar_flutter_plugin dependency:

    Update the pubspec.yaml file of your Flutter project and add the following line under the dependencies section:

    dependencies:
      ar_flutter_plugin: ^0.7.3

    This step ensures that your Flutter project has the necessary dependencies to integrate augmented reality using the ar_flutter_plugin package.
    Run `flutter pub get` to fetch the package.

    Step 3: Initializing the AR View:

    Create a new Dart file for the AR screen. Import the required packages at the top of the file:
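    Based on the plugin’s bundled example, the imports typically look like the following. The exact paths can vary between ar_flutter_plugin versions, so adjust them to the version you installed:

```dart
import 'package:flutter/material.dart';
// ar_flutter_plugin imports (paths as of the 0.7.x example app):
import 'package:ar_flutter_plugin/ar_flutter_plugin.dart';
import 'package:ar_flutter_plugin/managers/ar_session_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_object_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_anchor_manager.dart';
import 'package:ar_flutter_plugin/managers/ar_location_manager.dart';
import 'package:ar_flutter_plugin/models/ar_node.dart';
import 'package:ar_flutter_plugin/models/ar_anchor.dart';
import 'package:ar_flutter_plugin/models/ar_hittest_result.dart';
import 'package:ar_flutter_plugin/datatypes/config_planedetection.dart';
import 'package:ar_flutter_plugin/datatypes/node_types.dart';
import 'package:ar_flutter_plugin/datatypes/hittest_result_types.dart';
// Vector math types (Vector3/Vector4) used when placing nodes:
import 'package:vector_math/vector_math_64.dart';
```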

    Define a new class called ARScreen that extends StatefulWidget, along with its companion State class. Together they represent the AR screen and handle the initialization and rendering of the AR view:

    class ARScreen extends StatefulWidget {
      const ARScreen({Key? key}) : super(key: key);

      @override
      _ARScreenState createState() => _ARScreenState();
    }

    class _ARScreenState extends State<ARScreen> {
      ARSessionManager? arSessionManager;
      ARObjectManager? arObjectManager;
      ARAnchorManager? arAnchorManager;

      List<ARNode> nodes = [];
      List<ARAnchor> anchors = [];

      @override
      void dispose() {
        super.dispose();
        arSessionManager!.dispose();
      }

      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: const Text('Anchors & Objects on Planes'),
          ),
          body: Stack(children: [
            ARView(
              onARViewCreated: onARViewCreated,
              planeDetectionConfig: PlaneDetectionConfig.horizontalAndVertical,
            ),
            Align(
              alignment: FractionalOffset.bottomCenter,
              child: Row(
                mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                children: [
                  ElevatedButton(
                      onPressed: onRemoveEverything,
                      child: const Text("Remove Everything")),
                ],
              ),
            ),
          ]),
        );
      }

    Step 4: Add AR functionality:

    Create a method onARViewCreated for the ARView’s onARViewCreated callback. You can add the required AR functionality in this method, such as loading 3D models or handling interactions. In our demo, we will add 3D models in AR on tap:

    void onARViewCreated(
          ARSessionManager arSessionManager,
          ARObjectManager arObjectManager,
          ARAnchorManager arAnchorManager,
          ARLocationManager arLocationManager) {
        this.arSessionManager = arSessionManager;
        this.arObjectManager = arObjectManager;
        this.arAnchorManager = arAnchorManager;
    
        this.arSessionManager!.onInitialize(
              showFeaturePoints: false,
              showPlanes: true,
              customPlaneTexturePath: "Images/triangle.png",
              showWorldOrigin: true,
            );
        this.arObjectManager!.onInitialize();
    
        this.arSessionManager!.onPlaneOrPointTap = onPlaneOrPointTapped;
        this.arObjectManager!.onNodeTap = onNodeTapped;
      }

    After this, create a method onPlaneOrPointTapped for handling interactions.

    Future<void> onPlaneOrPointTapped(
          List<ARHitTestResult> hitTestResults) async {
        var singleHitTestResult = hitTestResults.firstWhere(
            (hitTestResult) => hitTestResult.type == ARHitTestResultType.plane);
        var newAnchor =
            ARPlaneAnchor(transformation: singleHitTestResult.worldTransform);
        bool? didAddAnchor = await arAnchorManager!.addAnchor(newAnchor);
        if (didAddAnchor!) {
          anchors.add(newAnchor);
      // Add node to anchor
          var newNode = ARNode(
              type: NodeType.webGLB,
              uri:
    "https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/glTF-Binary/Duck.glb",
              scale: Vector3(0.2, 0.2, 0.2),
              position: Vector3(0.0, 0.0, 0.0),
              rotation: Vector4(1.0, 0.0, 0.0, 0.0));
          bool? didAddNodeToAnchor = await arObjectManager!
              .addNode(newNode, planeAnchor: newAnchor);
          if (didAddNodeToAnchor!) {
            nodes.add(newNode);
          } else {
            arSessionManager!.onError("Adding Node to Anchor failed");
          }
        } else {
          arSessionManager!.onError("Adding Anchor failed");
        }
      }

    Finally, create a method for onRemoveEverything to remove all the elements on the screen.

    Future<void> onRemoveEverything() async {
      for (var anchor in anchors) {
        arAnchorManager!.removeAnchor(anchor);
      }
      anchors = [];
    }

    Step 5: Run the AR screen:

    In your app’s main entry point, set the ARScreen as the home screen:

    void main() {
      runApp(MyApp());
    }
    
    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          home: ARScreen(),
        );
      }
    }

    In the example below, we can observe the AR functionality implemented. We are loading a Duck 3D Model whenever the user taps on the screen. The plane is auto-detected, and once that is done, we can add a model to it. We also have a floating button to remove everything that is on the plane at the given moment.

    Benefits of AR Integration:

    • Immersive User Experience: Augmented reality adds an extra dimension to user interactions, creating immersive and captivating experiences. Users can explore virtual objects within their real environment, leading to increased engagement and satisfaction.
    • Interactive Product Visualization: AR allows users to visualize products in real-world settings before making a purchase. They can view how furniture fits in their living space, try on virtual clothes, or preview architectural designs. This interactive visualization enhances decision-making and improves customer satisfaction.
    • Gamification and Entertainment: Augmented reality opens up opportunities for gamification and entertainment within apps. You can develop AR games, quizzes, or interactive storytelling experiences, providing users with unique and enjoyable content.
    • Marketing and Branding: By incorporating AR into your Flutter app, you can create innovative marketing campaigns and branding experiences. AR-powered product demonstrations, virtual try-ons, or virtual showrooms help generate excitement around your brand and products.

    Conclusion:

    Integrating augmented reality into a Flutter app brings a new level of interactivity and immersion to the user experience. Flutter’s versatility with AR frameworks like ARCore and ARKit empowers developers to create captivating and innovative mobile applications. By following the steps outlined in this blog post, you can unlock the potential of augmented reality and deliver exceptional user experiences that delight and engage your audience. Embrace the possibilities of AR in Flutter and embark on a journey of exciting and immersive app development.

  • What’s New with Material 3 in Flutter: Discussing the Key Updates with an Example

    At Google I/O 2021, Google unveiled Material You, the next evolution of Material Design, along with Android 12. This update introduced Material Design 3 (M3), bringing a host of significant changes and improvements to the Material Design system. For Flutter developers, adopting Material 3 offers a seamless and consistent design experience across multiple platforms. In this article, we will delve into the key changes of Material 3 in Flutter and explore how it enhances the app development process.

    1. Dynamic Color:

    One of the notable features of Material 3 is dynamic color, which enables developers to apply consistent colors throughout their apps. By leveraging the Material Theme Builder web app or the Figma plugin, developers can visualize and create custom color schemes based on a given seed color. The dynamic color system ensures that colors from different tonal palettes are applied consistently across the UI, resulting in a harmonious visual experience.

    2. Typography:

    Material 3 simplifies typography by categorizing it into five key groups: Display, Headline, Title, Body, and Label. This categorization makes using different sizes within each group easier, catering to devices with varying screen sizes. The scaling of typography has also become consistent across the groups, offering a more streamlined and cohesive approach to implementing typography in Flutter apps.

    3. Shapes:

    Material 3 introduces a wider range of shapes, including squared, rounded, and rounded rectangular shapes. Previously circular elements, such as the FloatingActionButton (FAB), have now transitioned to a rounded rectangular shape. Additionally, widgets like Card, Dialog, and BottomSheet feature a more rounded appearance in Material 3. These shape enhancements give developers more flexibility in designing visually appealing and modern-looking user interfaces.

    4. Elevation:

    In Material Design 2, elevated components had shadows that varied based on their elevation values. Material 3 takes this a step further by introducing the surfaceTintColor color property. This property applies a color to the surface of elevated components, with the intensity varying based on the elevation value. By incorporating surfaceTintColor, elevated components remain visually distinguishable even without shadows, resulting in a more polished and consistent UI.

    Let’s go through each of them in detail.

    Dynamic Color

    Dynamic color in Flutter enables you to apply consistent colors throughout your app. It includes key and neutral colors from different tonal palettes, ensuring a harmonious UI experience. You can use tools like Material Theme Builder or Figma plugin to create a custom color scheme to visualize and generate dynamic colors. By providing a seed color in your app’s theme, you can easily create an M3 ColorScheme. For example, adding “colorSchemeSeed: Colors.green” to your app will result in a lighter green color for elements like the FloatingActionButton (FAB), providing a customized look for your app.

    theme: ThemeData(
      // primarySwatch: Colors.blue,
      useMaterial3: true,
      colorSchemeSeed: Colors.green,
    ),

    Note:
    When using the colorSchemeSeed in Flutter, it’s important to note that if you have already defined a primarySwatch in your app’s theme, you may encounter an assertion error. The error occurs because colorSchemeSeed and primarySwatch should not be used together. To avoid this issue, ensure that you either remove the primarySwatch or set colorSchemeSeed to null when using the colorSchemeSeed feature in your Flutter app.

    Using Material 3

    Typography

    In Material 3, the naming of typography has been made simpler by dividing it into five main groups: 

    1. Display 
    2. Headline 
    3. Title 
    4. Body 
    5. Label

    Each group has a more descriptive role, making it easier to use different font sizes within a specific typography group. For example, instead of using names like bodyText1, bodyText2, and caption, Material 3 introduces names like bodyLarge, bodyMedium, and bodySmall. This improved naming system is particularly helpful when designing typography for devices with varying screen sizes.
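    To make the naming concrete, here is a minimal sketch of a widget that styles text with the Material 3 typography group names via the theme’s TextTheme (the widget name is hypothetical):

```dart
import 'package:flutter/material.dart';

// Sketch: one text per Material 3 typography group, read from the theme.
class TypographyDemo extends StatelessWidget {
  const TypographyDemo({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    final textTheme = Theme.of(context).textTheme;
    return Column(
      children: [
        Text('Display', style: textTheme.displaySmall),
        Text('Headline', style: textTheme.headlineMedium),
        Text('Title', style: textTheme.titleMedium),
        Text('Body', style: textTheme.bodyLarge), // was bodyText1 in M2
        Text('Label', style: textTheme.labelSmall),
      ],
    );
  }
}
```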

    Shapes

    Material 3 introduces an expanded selection of shapes, including square, rounded, and rounded rectangular shapes. The Floating Action Button (FAB), which used to be circular, now has a rounded rectangular shape. Material buttons have transitioned from rounded rectangular to pill-shaped. Additionally, widgets such as Card, Dialog, and BottomSheet have adopted a more rounded appearance in Material 3.

    Elevation

    In Material 2, elevated components were accompanied by shadows, with the size of the shadow increasing as the elevation increased. Material 3 brings a new feature called surfaceTintColor. When applied to elevated components, the surface of these components takes on the specified color, with the intensity varying based on the elevation value. This property is now available for all elevated widgets in Flutter, alongside elevation and shadow properties.

    Here’s an example Flutter app that demonstrates the key changes in Material 3 regarding dynamic color, typography, shapes, and elevation. This example app includes a simple screen with a colored container and text, showcasing the usage of these new features:

    //main.dart
    import 'package:flutter/material.dart';
    void main() {
      runApp(MyApp());
    }
    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          debugShowCheckedModeBanner: false,
          theme: ThemeData(
            useMaterial3: true,
            colorSchemeSeed: Colors.green,
          ),
          home: const MyHomePage(),
        );
      }
    }
    class MyHomePage extends StatelessWidget {
      const MyHomePage({Key? key}) : super(key: key);
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text(
              'Material 3 Key Changes',
              style: Theme.of(context).textTheme.headlineSmall,
            ),
            elevation: 8,
            shadowColor: Theme.of(context).shadowColor,
          ),
          body: Container(
            width: double.infinity,
            height: 200,
            color: Theme.of(context).colorScheme.secondary,
            padding: const EdgeInsets.all(16.0),
            child: Center(
              child: Text(
                'Hello, Material 3!',
                style: Theme.of(context).textTheme.bodyLarge?.copyWith(
                      color: Colors.white,
                    ),
              ),
            ),
          ),
          floatingActionButton: FloatingActionButton(
            onPressed: () {},
            child: const Icon(Icons.add),
          ),
        );
      }
    }

    Conclusion:

    Material 3 represents a significant update to the Material Design system in Flutter, offering developers a more streamlined and consistent approach to app design. The dynamic color feature allows for consistent colors throughout the UI, while the simplified typography and expanded shape options provide greater flexibility in creating visually engaging interfaces. Moreover, the enhancements in elevation ensure a cohesive and polished look for elevated components.

    As Flutter continues to evolve and adapt to Material 3, developers can embrace these key changes to create beautiful, personalized, and accessible designs across different platforms. The Flutter team has been diligently working to provide full support for Material 3, enabling developers to migrate their existing Material 2 apps seamlessly. By staying up to date with the progress of Material 3 implementation in Flutter, developers can leverage its features to enhance their app development process and deliver exceptional user experiences.

    Remember, Material 3 is an exciting opportunity for Flutter developers to create consistent and unified UI experiences, and exploring its key changes opens up new possibilities for app design.

  • Flame Engine : Unleashing Flutter’s Game Development Potential

    With Flutter, developers can leverage a single codebase to seamlessly build applications for diverse platforms, including Android, iOS, Linux, macOS, Windows, Google Fuchsia, and the web. The Flutter team remains dedicated to empowering developers of all backgrounds, ensuring effortless creation and publication of applications using this powerful multi-platform UI toolkit.

    Flutter makes developing standard applications straightforward. However, if your aim is to craft an extraordinary game with stunning graphics, captivating gameplay, lightning-fast loading times, and highly responsive interactions, Flame emerges as the perfect solution.

    This blog will provide you with an in-depth understanding of Flame. Through the features provided by Flame, you will embark on a journey to master the art of building a Flutter game from the ground up. You will gain invaluable insights into seamlessly integrating animations, configuring immersive soundscapes, and efficiently managing diverse game assets.

    1. Flame engine

    Flame is a cutting-edge 2D modular game engine designed to provide a comprehensive suite of specialized solutions for game development. Leveraging the powerful architecture of Flutter, Flame significantly simplifies the coding process, empowering you to create remarkable projects with efficiency and precision.

    1.1. Setup: 

    Run this command with Flutter:

    $ flutter pub add flame

    This will add a line like this to your package’s pubspec.yaml (and run an implicit flutter pub get):

    dependencies:
      flame: ^1.8.1

    Import it, and now, in your Dart code, you can use:

    import 'package:flame/flame.dart';

    1.2. Assets Structure: 

    Flame introduces a well-structured assets directory framework, enabling seamless utilization of these resources within your projects.
    To illustrate the concepts further, let’s delve into a practical example that showcases the application of the discussed principles:

    Flame.images.load('card_sprites.png');
    FlameAudio.play('shuffling.mp3');

    When utilizing image and audio assets in Flame, you can simply specify the asset name without the need for the full path, given that you place the assets within the suggested directories as outlined below.

    For better organization, you have the option to divide your audio folder into two distinct subfolders: music and sfx

    The music directory is intended for audio files used as background music, while the sfx directory is specifically designated for sound effects, encompassing shots, hits, splashes, menu sounds, and more.

    To properly configure your project, it is crucial to include entries for the above-mentioned directories in your pubspec.yaml file:
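    Assuming the directory layout described above (an images folder plus music and sfx subfolders under audio; adjust the paths to your actual folder names), the pubspec.yaml entries would look like:

```yaml
flutter:
  assets:
    - assets/images/
    - assets/audio/music/
    - assets/audio/sfx/
```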

    1.3. Support to other platforms: 

    As Flame is built upon the robust foundation of Flutter, its platform support is inherently reliant on Flutter’s compatibility with various platforms. Therefore, the range of platforms supported by Flame is contingent upon Flutter’s own platform support.

    Presently, Flame offers extensive support for desktop platforms such as Windows, macOS, and Linux, in addition to mobile platforms, including Android and iOS. Furthermore, Flame also facilitates game development for the web. It is important to note that Flame primarily focuses on stable channel support, ensuring a reliable and robust experience. While Flame may not provide direct assistance for the dev, beta, and master channels, it is expected that Flame should function effectively in these environments as well.

    1.3.1. Flutter web: 

    To optimize the performance of your web-based game developed with Flame, it is recommended to ensure that your game is utilizing the CanvasKit/Skia renderer. By leveraging the canvas element instead of separate DOM elements, this choice enhances web performance significantly. Therefore, incorporating the CanvasKit/Skia renderer within your Flame-powered game is instrumental in achieving optimal performance on the web platform.

    To run your game using Skia, use the following command:

    flutter run -d chrome --web-renderer canvaskit

    To build the game for production, using Skia, use the following:

    flutter build web --release --web-renderer canvaskit

    2. Implementation

    2.1 GameWidget: 

    To integrate a Game instance into the Flutter widget tree, the recommended approach is to utilize the GameWidget. This widget serves as the root of your game application, enabling seamless integration of your game. You can incorporate a Game instance into the widget tree by following the example provided below:

    void main() {
      runApp(
        GameWidget(game: MyGame()),
      );
    }

    By adopting this approach, you can effectively add your Game instance to the Flutter widget tree, ensuring proper execution and integration of your game within the Flutter application structure.

    2.2 FlameGame:

    When developing games in Flutter, it is crucial to utilize a widget that can efficiently handle high refresh rates and rapid memory allocation and deallocation, and that provides enhanced functionality compared to the Stateless and Stateful widgets. Flame offers the FlameGame class, which excels in providing these capabilities.

    By utilizing the FlameGame class, you can create games by adding components to it. This class automatically calls the update and render methods of all the components added to it. Components can be directly added to the FlameGame through the constructor using the named children argument, or they can be added from anywhere else using the add or addAll methods.

    To incorporate the FlameGame into the widget tree, you need to pass its object to the GameWidget. Refer to the example below for clarification:

    class CardMatchGame extends FlameGame {
      @override
      Future<void> onLoad() async {
        await add(CardTable());
      }
    }

    void main() {
      final cardMatchGame = CardMatchGame();
      runApp(
        GameWidget(
          game: cardMatchGame,
        ),
      );
    }

    2.3 Component:

    This is the last piece of the puzzle: the smallest individual building blocks that make up the game. A component is like a widget, but within the game. All components can have other components as children, and all of them inherit from the abstract class Component. These components serve as the fundamental entities responsible for rendering and interactivity within the game, and their hierarchical organization allows for flexible and modular construction of complex game systems in Flame. Components also have their own lifecycle.

    Component Lifecycle:

    Figure 01

    2.3.1. onLoad:

    The onLoad method serves as a crucial component within the game’s lifecycle, allowing for the execution of asynchronous operations such as image loading. Positioned between the onGameResize and onMount callbacks, this method is strategically placed to ensure the necessary assets are loaded and prepared. In Figure 01 of the component lifecycle, onLoad is set as the initial method due to its one-time execution. It is within this method that all essential assets, including images, audio files, and tmx files, should be loaded. This ensures that these assets are readily available for utilization throughout the game’s progression.
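    A typical use of onLoad is loading a sprite once, before the component is mounted. A minimal sketch under that assumption (the asset name matches the card sheet loaded earlier; the size is hypothetical):

```dart
import 'package:flame/components.dart';

// Sketch: load the component's sprite inside onLoad, so it is ready
// by the time the component is mounted into the game tree.
class Card extends SpriteComponent {
  @override
  Future<void> onLoad() async {
    // Resolved from the images asset directory configured in pubspec.yaml.
    sprite = await Sprite.load('card_sprites.png');
    size = Vector2(64, 89); // hypothetical card dimensions
  }
}
```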

    2.3.2. onGameResize:

    Invoked when new components are added to the component tree or when the screen undergoes resizing, the onGameResize method plays a vital role in handling these events. It is executed before the onMount callback, allowing for necessary adjustments to be made in response to changes in component structure or screen dimensions.

    2.3.3. onParentResize:

    This method is triggered when the parent component undergoes a change in size or whenever the current component is mounted within the component tree. By leveraging the onParentResize callback, developers can implement logic that responds to parent-level resizing events and ensures the proper rendering and positioning of the component.

    2.3.4. onMount:

    As the name suggests, the onMount method is executed each time a component is mounted into the game tree. This critical method offers an opportunity to initialize the component and perform any necessary setup tasks before it becomes an active part of the game.

    2.3.5. onRemove:

    The onRemove method facilitates the execution of code just before a component is removed from the game tree. Regardless of whether the component is removed using the parent’s remove method or the component’s own remove method, this method ensures that the necessary cleanup actions take place in a single execution.

    2.3.6. onChildrenChanged:

    The onChildrenChanged method is triggered whenever a change occurs in a child component. Whether a child is added or removed, this method provides an opportunity to handle the updates and react accordingly, ensuring the parent component remains synchronized with any changes in its children.

    2.3.7. Render & Update Loop:

    The Render method is responsible for generating the user interface, utilizing the available data to create the game screen. It provides developers with canvas objects, allowing them to draw the game’s visual elements. On the other hand, the Update method is responsible for modifying and updating this rendered UI. Changes such as resizing, repositioning, or altering the appearance of components are managed through the Update method. In essence, any changes observed in the size or position of a component can be attributed to the Update method, which ensures the dynamic nature of the game’s user interface.
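To see how these lifecycle callbacks fit together, here is a minimal sketch of a custom component (the class name and log messages are hypothetical, assuming Flame ^1.8.0) that overrides the methods described above:

```dart
import 'dart:ui';

import 'package:flame/components.dart';

// Hypothetical component that logs the lifecycle stages described above.
class LifecycleLoggingComponent extends PositionComponent {
  @override
  Future<void> onLoad() async {
    // Runs once, between onGameResize and onMount: load images, audio,
    // and tmx files here so they are ready for the rest of the game.
    print('onLoad');
  }

  @override
  void onGameResize(Vector2 size) {
    super.onGameResize(size);
    // Called before onMount and again whenever the screen is resized.
    print('onGameResize: $size');
  }

  @override
  void onMount() {
    super.onMount();
    // Called every time this component is mounted into the game tree.
    print('onMount');
  }

  @override
  void onRemove() {
    // Runs once, just before the component leaves the game tree.
    print('onRemove');
    super.onRemove();
  }

  @override
  void update(double dt) {
    super.update(dt);
    // Change size or position here; `dt` is the time since the last frame.
  }

  @override
  void render(Canvas canvas) {
    super.render(canvas);
    // Draw this component's visual elements onto the provided canvas.
  }
}
```

Adding an instance of this component to a game and resizing the window would show the callback order described in Figure 01.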

    3. Sample Project

    To showcase the practical implementation of key classes like GameWidget, FlameGame, and essential Components within the Flame game engine, we will embark on the creation of a captivating action game. By engaging in this hands-on exercise, you will gain valuable insights and hands-on experience in utilizing Flame’s core functionalities and developing compelling games. Through this guided journey, you will unlock the knowledge and skills necessary to create engaging and immersive gaming experiences, while harnessing the power of Flame’s robust framework.

    Let’s start with:

    3.1. Packages & assets: 

    3.1.1. Create a project using the following command:

    flutter create flutter_game_poc

    3.1.2. Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    flame: ^1.8.0

    3.1.3. As mentioned earlier in the Asset Structure section, let’s create a directory called assets in your project and include an images subdirectory within it. Download assets from here, and add both assets to this images directory.

    Figure 02 

    Figure 03

    In our game, we’ll use “Figure 02” as the background image and “Figure 03” as the avatar character who will be walking. If you have separate images for the avatar’s different walking frames, you can utilize a sprite generator tool to create a sprite sheet from those individual images.

    A sprite generator helps combine multiple separate images into a single sprite sheet, which enables efficient rendering and animation of the character in the game. You can find various sprite generator tools available online that can assist in generating a sprite sheet from your separate avatar images.

    By using a sprite sheet, you can easily manage and animate the character’s walking motion within the game, providing a smooth and visually appealing experience for the players.

    After uploading, your asset structure will look like this: 

    Figure 04

    3.1.4. To use these assets, we have to register them in pubspec.yaml under the assets section: 

    assets: 
      - assets/images/

    3.2. Supporting code: 

    3.2.1. Create three directories (constants, overlays, and components) inside the lib directory.

    3.2.2. First, we will start with the constants directory, where we have to create four files as follows:

       all_constants.dart. 

    export 'assets_constants.dart';
    export 'enum_constants.dart';
    export 'key_constants.dart';

       assets_constants.dart. 

    class AssetConstants {
     static String backgroundImage = 'background.png';
     static String avatarImage = 'avatar_sprite.png';
    }

       enum_constants.dart. 

    enum WalkingDirection { idle, up, down, left, right }

       key_constants.dart. 

    class KeyConstants {
     static String overlayKey = 'DIRECTION_BUTTON';
    }

    3.2.3. In addition to the constants directory, we will create an overlays directory to include elements that need to be constantly visible to the user during the game. These elements typically include information such as the score, health, or action buttons.

    For our game, we will incorporate five control buttons that allow us to direct the gaming avatar’s movements. These buttons will remain visible on the screen at all times, facilitating player interaction and guiding the avatar’s actions within the game environment.

    Organizing these overlay elements in a separate directory makes it easier to manage and update the user interface components that provide vital information and interaction options to the player while the game is in progress.

    3.2.4. In order to effectively manage and control the position of all overlay widgets within our game, let’s create a dedicated controller. This controller will serve as a centralized entity responsible for orchestrating the placement and behavior of these overlay elements. Create a file named overlay_controller.dart in the overlays directory.

    All the files in the overlays directory are common widgets that extend StatelessWidget.

    class OverlayController extends StatelessWidget {
     final WalkingGame game;
     const OverlayController({super.key, required this.game});
    
    
     @override
     Widget build(BuildContext context) {
       return Column(children: [
         Row(children: [ButtonOverlay(game: game)])
       ]);
     }
    }

    3.2.5. In our game, all control buttons share a common design, featuring distinct icons and functionalities. To streamline the development process and maintain a consistent user interface, we will create a versatile widget called DirectionButton. This custom widget will handle the uniform UI design for all control buttons.

    Inside the overlays directory, create a directory called widgets and add a file called direction_button.dart in that directory. This file defines the shape and color of all control buttons. 

    class DirectionButton extends StatelessWidget {
     final IconData iconData;
     final VoidCallback onPressed;
    
    
     const DirectionButton(
         {super.key, required this.iconData, required this.onPressed});
    
    
     @override
     Widget build(BuildContext context) {
       return Container(
         height: 40,
         width: 40,
         margin: const EdgeInsets.all(4),
         decoration: const BoxDecoration(
             color: Colors.black45,
             borderRadius: BorderRadius.all(Radius.circular(10))),
         child: IconButton(
           icon: Icon(iconData),
           iconSize: 20,
           color: Colors.white,
           onPressed: onPressed,
         ),
       );
     }
    }

    class ButtonOverlay extends StatelessWidget {
     final WalkingGame game;
     const ButtonOverlay({Key? key, required this.game}) : super(key: key);
    
    
     @override
     Widget build(BuildContext context) {
       return SizedBox(
         height: MediaQuery.of(context).size.height,
         width: MediaQuery.of(context).size.width,
         child: Column(
           children: [
             Expanded(child: Container()),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_drop_up,
                   onPressed: () {
                     game.direction = WalkingDirection.up;
                   },
                 ),
                 const SizedBox(height: 50, width: 50)
               ],
             ),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_left,
                   onPressed: () {
                     game.direction = WalkingDirection.left;
                   },
                 ),
                 DirectionButton(
                   iconData: Icons.pause,
                   onPressed: () {
                     game.direction = WalkingDirection.idle;
                   },
                 ),
                 DirectionButton(
                   iconData: Icons.arrow_right,
                   onPressed: () {
                     game.direction = WalkingDirection.right;
                   },
                 ),
               ],
             ),
             Row(
               children: [
                 Expanded(child: Container()),
                 DirectionButton(
                   iconData: Icons.arrow_drop_down,
                   onPressed: () {
                     game.direction = WalkingDirection.down;
                   },
                 ),
                 const SizedBox(height: 50, width: 50),
               ],
             ),
           ],
         ),
       );
     }
    }

    3.3. Core logic: 

    Moving forward, we will leverage the code we have previously implemented, building upon the foundations we have laid thus far:

    3.3.1. The first step is to create a component. As discussed earlier, all the individual elements in the game are considered components, so let’s create one component that will be our gaming avatar. For the UI of this avatar, we are going to use the asset shown in Figure 03.

    For the avatar, we will be using SpriteAnimationComponent, as we want this component to animate automatically.

    In the components directory, create a file called avatar_component.dart. This file will hold the logic of when and how our game avatar will move. 

    In the onLoad() method, we are loading the asset and using it to create animations, and in the update() method, we are using an enum to decide the walking animation.

    class AvatarComponent extends SpriteAnimationComponent with HasGameRef {
     final WalkingGame walkingGame;
     AvatarComponent({required this.walkingGame}) {
       add(RectangleHitbox());
     }
     late SpriteAnimation _downAnimation;
     late SpriteAnimation _leftAnimation;
     late SpriteAnimation _rightAnimation;
     late SpriteAnimation upAnimation;
     late SpriteAnimation _idleAnimation;
     final double _animationSpeed = .1;
    
    
     @override
     Future<void> onLoad() async {
       await super.onLoad();
    
    
       final spriteSheet = SpriteSheet(
         image: await gameRef.images.load(AssetConstants.avatarImage),
         srcSize: Vector2(2284 / 12, 1270 / 4),
       );
    
    
       _downAnimation =
           spriteSheet.createAnimation(row: 0, stepTime: _animationSpeed, to: 11);
       _leftAnimation =
           spriteSheet.createAnimation(row: 1, stepTime: _animationSpeed, to: 11);
       upAnimation =
           spriteSheet.createAnimation(row: 3, stepTime: _animationSpeed, to: 11);
       _rightAnimation =
           spriteSheet.createAnimation(row: 2, stepTime: _animationSpeed, to: 11);
       _idleAnimation =
           spriteSheet.createAnimation(row: 0, stepTime: _animationSpeed, to: 1);
       animation = _idleAnimation;
     }
    
    
     @override
     void update(double dt) {
       switch (walkingGame.direction) {
         case WalkingDirection.idle:
           animation = _idleAnimation;
           break;
         case WalkingDirection.down:
           animation = _downAnimation;
           if (y < walkingGame.mapHeight - height) {
             y += dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.left:
           animation = _leftAnimation;
           if (x > 0) {
             x -= dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.up:
           animation = upAnimation;
           if (y > 0) {
             y -= dt * walkingGame.characterSpeed;
           }
           break;
         case WalkingDirection.right:
           animation = _rightAnimation;
           if (x < walkingGame.mapWidth - width) {
             x += dt * walkingGame.characterSpeed;
           }
           break;
       }
       super.update(dt);
     }
    }

    3.3.2. Our avatar is ready to walk now, but there is no map or world where it can do that. So, let’s create a game and add a background to it.

    Create a file named walking_game.dart in the lib directory and add the following code.

    class WalkingGame extends FlameGame with HasCollisionDetection {
     final double mapWidth = 2520;
     final double mapHeight = 1300;
     WalkingDirection direction = WalkingDirection.idle;
     final double characterSpeed = 80;
     final _world = World();
    
    
     // Avatar sprite
     late AvatarComponent _avatar;
    
    
     // Background image
     late SpriteComponent _background;
     final Vector2 _backgroundSize = Vector2(2520, 1300);
    
    
     // Camera Components
     late final CameraComponent _cameraComponent;
    
    
     @override
     Future<void> onLoad() async {
       await super.onLoad();
    
    
       overlays.add(KeyConstants.overlayKey);
    
    
       _background = SpriteComponent(
         sprite: Sprite(
           await images.load(AssetConstants.backgroundImage),
           srcPosition: Vector2(0, 0),
           srcSize: _backgroundSize,
         ),
         position: Vector2(0, 0),
         size: Vector2(2520, 1300),
       );
       _world.add(_background);
    
    
       _avatar = AvatarComponent(walkingGame: this)
         ..position = Vector2(529, 128)
         ..debugMode = true
         ..size = Vector2(1145 / 24, 635 / 8);
    
    
       _world.add(_avatar);
    
    
       _cameraComponent = CameraComponent(world: _world)
         ..setBounds(Rectangle.fromLTRB(390, 200, mapWidth - 390, mapHeight - 200))
         ..viewfinder.anchor = Anchor.center
         ..follow(_avatar);
    
    
       addAll([_cameraComponent, _world]);
     }
    }

    The first thing you can see in onLoad() is that we are adding an overlay using a key. This key is used again in the main class.

    Next, we create the background component using SpriteComponent and add it to the world component. We use SpriteComponent instead of SpriteAnimationComponent because we do not need any background animation in our game.

    Then we add the AvatarComponent to the same world component where we added the background component. To keep the camera fixed on the AvatarComponent, we use one extra component, called CameraComponent.

    Lastly, we add both the world and the CameraComponent to our game using the addAll() method.

    3.3.3. Finally, we have to create the main.dart file. In this example, we are wrapping a GameWidget with MaterialApp because we want to use some features of Material themes, like icons, in this project. If you do not want to do that, you can pass GameWidget to the runApp() method directly.
    Here we are not only adding the WalkingGame into GameWidget but also adding an overlay, which will show the control buttons. The key mentioned here for the overlay is the same key we added in walking_game.dart file’s onLoad() method.

    void main() {
     WidgetsFlutterBinding.ensureInitialized();
     Flame.device.fullScreen();
     runApp(MaterialApp(
       home: Scaffold(
         body: GameWidget(
           game: WalkingGame(),
           overlayBuilderMap: {
             KeyConstants.overlayKey: (BuildContext context, WalkingGame game) {
               return OverlayController(game: game);
             }
           },
         ),
       ),
     ));
    }

    After all this, our game will look like this, and with these five control buttons, we can tell our avatar to move and/or stop.

    4. Result

    For your convenience, the complete code for the project can be found here. Feel free to refer to this code repository for a comprehensive overview of the implementation details and to access the entirety of the game’s source code.

    5. Conclusion

    The Flame game engine alleviates the burden of crucial tasks such as asset loading, managing refresh rates, and efficient memory management. By taking care of these essential functionalities, Flame allows developers to concentrate on implementing the core functionality and creating an exceptional game application.

    By leveraging Flame’s capabilities, you can maximize your productivity and create an amazing game application that resonates with players across various platforms, all while enjoying the benefits of a unified codebase.

    6. References

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/flame
    3. https://docs.flame-engine.org/latest
    4. https://medium.flutterdevs.com/flame-with-flutter-4c6c3bd8931c
    5. https://supabase.com/blog/flutter-real-time-multiplayer-game
    6. https://www.kodeco.com/27407121-building-games-in-flutter-with-flame-getting-started
    7. https://blog.codemagic.io/flutter-flame-game-development/
    8. https://codelabs.developers.google.com/codelabs/flutter-flame-game

  • Machine Learning in Flutter using TensorFlow

    Machine learning has become part of day-to-day life. Small tasks like searching for songs on YouTube and getting suggestions on Amazon use ML in the background. This is a well-developed field of technology with immense possibilities. But how can we use it?

    This blog is aimed at explaining how easy it is to use machine learning models (which will act as a brain) to build powerful ML-based Flutter applications. We will briefly touch on the following points:

    1. Definitions

    Let’s jump to the part where most people are confused. A person who is not exposed to the IT industry might think AI, ML, and DL are all the same. So, let’s understand the differences.

    Figure 01

    1.1. Artificial Intelligence (AI): 

    AI, i.e., artificial intelligence, is the concept of machines being able to carry out tasks in a smarter way. You must have used YouTube: in the search bar, you can type the lyrics of any song, not necessarily the opening lines or the title, and get almost perfect results every time. This is the work of a very powerful AI.
    Artificial intelligence is the ability of a machine to do tasks that are usually done by humans. This ability is special because the tasks in question require human intelligence and discernment.

    1.2. Machine Learning (ML):

    Machine learning is a subset of artificial intelligence. It is based on the idea that we expose machines to new data, which can be complete or partial, and let the machine decide the future output. We can also say it is a sub-field of AI that deals with the extraction of patterns from data sets. With each new data set, and by processing the previous result, the machine slowly converges on the expected result. This means that the machine can find rules for optimal behavior to produce new outputs. It can also adapt itself to new, changing data, just like humans.

    1.3. Deep Learning (DL): 

    Deep learning is again a smaller subset of machine learning, which is essentially a neural network with multiple layers. These neural networks attempt to simulate the behavior of the human brain, so you can say we are trying to create an artificial human brain. With one layer of a neural network, we can still make approximate predictions, and additional layers can help to optimize and refine for accuracy.

    2. Types of ML

    Before starting the implementation, we need to know the types of machine learning because it is very important to know which type is more suitable for our expected functionality.

    Figure 02

    2.1. Supervised Learning

    As the name suggests, in supervised learning, the learning happens under supervision. Supervision means the data provided to the machine is already classified, i.e., each piece of data has a fixed label, and inputs are already mapped to outputs.
    Once the machine has learned, it is ready to classify new data.
    This learning is useful for tasks like fraud detection, spam filtering, etc.

    2.2. Unsupervised Learning

    In unsupervised learning, the data given to machines to learn is purely raw, with no tags or labels. Here, the machine is the one that will create new classes by extracting patterns.
    This learning can be used for clustering, association, etc.

    2.3. Semi-Supervised Learning

    Both supervised and unsupervised learning have their own limitations, because one requires labeled data and the other provides none, so semi-supervised learning combines the behavior of both, and with that, we can overcome those limitations.
    In this learning, we feed both raw data and categorized data to the machine so it can classify the raw data and, if necessary, create new clusters.

    2.4. Reinforcement Learning

    For this learning, we feed the feedback on the last output, along with new incoming data, to the machine so it can learn from its mistakes. This feedback-based process continues until the machine reaches the desired output. The feedback is given by humans in the form of punishment or reward. It is like when a search algorithm gives you a list of results, but users click on nothing except the first result. It is like a human child, who learns from every available option and grows by correcting its mistakes.

    3. TensorFlow

    Machine learning is a complex process where we need to perform multiple activities, like acquiring and processing data, training models, serving predictions, and refining future results.

    To perform such operations, Google developed a framework called TensorFlow, released in November 2015. All the above-mentioned processes become easier if we use the TensorFlow framework.

    For this project, we are not going to use the complete TensorFlow framework but a smaller tool called TensorFlow Lite.

    3.1. TensorFlow Lite

    TensorFlow Lite allows us to run machine learning models on devices with limited resources, such as limited RAM or storage.

    3.2. TensorFlow Lite Features

    • Optimized for on-device ML by addressing five key constraints: 
        • Latency: because there’s no round-trip to a server 
        • Privacy: because no personal data leaves the device 
        • Connectivity: because internet connectivity is not required 
        • Size: because of a reduced model and binary size
        • Power consumption: because of efficient inference and a lack of network connections
    • Support for Android and iOS devices, embedded Linux, and microcontrollers
    • Support for Java, Swift, Objective-C, C++, and Python programming languages
    • High performance, with hardware acceleration and model optimization
    • End-to-end examples for common machine learning tasks such as image classification, object detection, pose estimation, question answering, text classification, etc., on multiple platforms

    4. What is Flutter?

    Flutter is an open-source, cross-platform development framework. With Flutter, using a single code base, we can create applications for Android, iOS, the web, and desktop. It was created by Google and uses Dart as its development language. The first stable version of Flutter was released in December 2018, and since then, there have been many improvements.

    5. Building an ML-Flutter Application

    We are now going to build a Flutter application through which we can find the state of mind of a person from their facial expressions. The steps below explain the changes we need to make on the Android side. For an iOS application, please refer to the links provided in the steps.

    5.1. TensorFlow Lite – Native setup (Android)

    • In android/app/build.gradle, add the following setting in the android block:
    aaptOptions {
        noCompress 'tflite'
        noCompress 'lite'
    }

    5.2. TensorFlow Lite – Flutter setup (Dart)

    • Create an assets folder and place your label file and model file in it. (We will create these files shortly.) In pubspec.yaml, add:
    assets:
       - assets/labels.txt
       - assets/<file_name>.tflite

     

    Figure 02

    • Run this command to install the TensorFlow Lite package: 
    $ flutter pub add tflite

    • This adds the following line to your package’s pubspec.yaml (and runs an implicit flutter pub get):
    dependencies:
         tflite: ^0.9.0

    • Now in your Dart code, you can use:
    import 'package:tflite/tflite.dart';

    • Add camera dependencies to your package’s pubspec.yaml (optional):
    dependencies:
         camera: ^0.10.0+1

    • Now in your Dart code, you can use:
    import 'package:camera/camera.dart';

    • As the camera is a hardware feature, there are a few updates we need to make in the native code for both Android and iOS. To learn more, visit:
    https://pub.dev/packages/camera
    • The following is the code that will appear under dependencies in pubspec.yaml once the setup is complete.
    Figure 03
    • Flutter will automatically download the most recent version if you omit the version number of a package.
    • Do not forget to add the assets folder in the root directory.

    5.3. Generate model (using website)

    • Click on Get Started

    • Select Image project
    • There are three different categories of ML projects available. We’ll choose an image project since we’re going to develop a project that analyzes a person’s facial expression to determine their emotional condition.
    • The other two types, audio project and pose project, will be useful for creating projects that involve audio operation and human pose indication, respectively.

    • Select Standard Image model
    • Once more, there are two distinct groups of image machine learning projects. Since we are creating a project for an Android smartphone, we will select the standard image model.
    • The other type, the Embedded Image Model, is designed for hardware with relatively little memory and computing power.

    • Upload images for training the classes
    • We will create new classes by clicking on “Add a class.”
    • We must upload photographs to these classes as we are developing a project that analyzes a person’s emotional state from their facial expression.
    • The more photographs we upload, the more precise our result will be.
    • Click on Train model and wait until the training is over
    • Click on Export model
    • Select the TensorFlow Lite tab -> Quantized button -> Download my model

    5.4. Add files/models to the Flutter project

    • labels.txt

    This file contains all the class names that you created during model creation.

     

    • *.tflite

    This is the trained model; the export is delivered as a ZIP archive containing the model file along with its associated files.

    5.5. Load & Run ML-Model

    • We import the model from assets, so this line of code is crucial. This model will serve as the project’s brain.
    • Here, we configure the camera using a camera controller and obtain a live feed (cameras[0] is the first camera returned by availableCameras()).
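The two bullets above can be sketched as follows. This is a minimal, hedged example assuming the tflite and camera packages added earlier; the asset file names and the helper function are placeholders, not the blog's exact code:

```dart
import 'package:camera/camera.dart';
import 'package:tflite/tflite.dart';

// Hypothetical helper: load the exported model, then classify live frames.
Future<void> startEmotionDetection() async {
  // Import the model from assets; these paths must match the entries
  // registered under assets in pubspec.yaml.
  await Tflite.loadModel(
    model: 'assets/model.tflite', // placeholder for <file_name>.tflite
    labels: 'assets/labels.txt',
  );

  // Configure the camera controller and obtain a live feed.
  final cameras = await availableCameras();
  final controller = CameraController(cameras[0], ResolutionPreset.medium);
  await controller.initialize();

  // Run the model on each incoming camera frame.
  controller.startImageStream((CameraImage image) async {
    final results = await Tflite.runModelOnFrame(
      bytesList: image.planes.map((plane) => plane.bytes).toList(),
      imageHeight: image.height,
      imageWidth: image.width,
      numResults: 2,
    );
    // Each result holds a class label and a confidence score.
    print(results);
  });
}
```

In a real app, the recognized label would be pushed into widget state rather than printed.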

    6. Conclusion

    Using TensorFlow Lite models as the brain of a Flutter app, we can build powerful on-device ML applications, such as the facial-expression classifier discussed in this blog.

  • How to Test the Performance of Flutter Apps – A Step-by-step Guide

    The rendering performance of a mobile app is crucial in ensuring a smooth and delightful user experience. 

    We will explore various ways of measuring the rendering performance of a mobile application, automate this process, and understand the intricacies of rendering performance metrics.

    Internals of Flutter and the Default Performance

    Before we begin exploring performance pitfalls and optimizations, we first need to understand the default performance of a basic hello-world Flutter app. Flutter apps are already highly optimized for speed and are known to perform better than existing cross-platform application development platforms, such as React Native or Apache Cordova.

    By default, Flutter apps aim to render at 60 frames per second on most devices and up to 120 frames per second on devices that support a 120 Hz refresh rate. This is made possible because of Flutter’s unique rendering mechanism.

    It doesn’t render UI components like a traditional mobile application framework, which composes native widgets on the screen. Instead, it uses a high-performance graphics engine, Skia, and renders all components on the screen as if they were part of a single two-dimensional scene.

    Skia is a highly optimized, two-dimensional graphics engine used by a variety of apps, such as Google Chrome, Fuchsia, Chrome OS, Flutter, etc. This game-like rendering behavior gives Flutter an advantage over existing application SDKs when it comes to default performance.

    Common Performance Pitfalls:

    Now, let’s understand some common performance issues or pitfalls seen in mobile applications. Some of them are listed below: 

    • Latency introduced because of network and disk IO 
    • When heavy computations are done on the main UI thread
    • Frequent and unnecessary state updates
    • Jittery UX due to lack of progressive or lazy loading of images and assets
    • Unoptimized or very large assets can take a lot of time to render

    To identify and fix these performance bottlenecks, mobile apps can be instrumented for time complexity and/or space complexity.

    Most of these issues can be identified using a profiler. Profiling an app means dynamically analyzing the application’s code, in a runtime environment, for CPU and memory usage, and sometimes for the usage of other resources, such as network and battery. Performance profiling entails analyzing CPU usage for time complexity, to identify parts of the application where CPU usage is high and beyond a certain threshold. Let’s see how profiling works in the Flutter ecosystem.

    How to Profile a Flutter App

    Below are a set of steps that you may follow along to set up profiling on a Flutter app.

    1. Launch the application in profile mode. To do so, we can run the app using the command
    flutter run --profile

    on the terminal or set up a launch configuration for the IDE or code editor. Testing the performance of Flutter apps in profile mode and not in debug (dev) mode ensures that the true release performance of the application is assessed. Dev mode has additional pieces of code running that aren’t part of release builds.

    2. Some developers may need to activate Flutter DevTools by executing this command: 
    flutter pub global activate devtools

    3. To set up profile mode, create a launch configuration for the Flutter app in VSCode: edit or create the file at project_directory/.vscode/launch.json and add a “Profile” configuration as follows:
    {
     "version": "0.2.0",
     "configurations": [
       {
         "name": "Development",
         "request": "launch",
         "type": "dart"
       },
       {
         "name": "Profile",
         "request": "launch",
         "type": "dart",
         "flutterMode": "profile"
       }
     ]
    }

    4. Once the application is running on a real device, go to the timeline view of DevTools and enable the performance overlay. This allows developers to see two graphs on top of each other, overlaid on top of the application. The top graph represents the raster thread timeline, and the second graph below it represents the UI thread timeline. 

    ⚠️ Caution: It is recommended that performance profiling of a Flutter application should only be done on a real device and not on any simulator or emulator. Simulators are not an exact representation of a real device when it comes to hardware and software capabilities, disk IO latency, display refresh rate, etc. Furthermore, the profiling is best done on the slowest, oldest device that the application targets. This ensures that the application is well-tested for performance pitfalls on target platforms and will offer a smooth user experience to end-users.

    Understanding the Performance Overlays

    Once the timeline view is enabled in profile mode, the application’s running instance gets an overlay on the top area. This overlay has two charts on top of each other.

    Both charts display timeline metrics 300 frames at a time. Any frame going over the horizontal black lines on the chart means that the frame is taking more than 16 milliseconds to render, which leads to a frame drop and eventually a jittery user experience.
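    The 16-millisecond budget comes from the display’s refresh rate: at 60 Hz, a new frame is needed roughly every 1000/60 ms. A quick pure-Dart sketch of this arithmetic (the function name is ours, not part of any Flutter API):

    ```dart
    // Per-frame render budget for a given display refresh rate.
    // At 60 Hz a frame must be built and rasterized in under ~16.7 ms;
    // on a 120 Hz display the budget shrinks to ~8.3 ms.
    double frameBudgetMillis(int refreshRateHz) => 1000 / refreshRateHz;

    void main() {
      print(frameBudgetMillis(60).toStringAsFixed(1)); // 16.7
      print(frameBudgetMillis(120).toStringAsFixed(1)); // 8.3
    }
    ```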

    Fig:- Dart profiler for optimal rendering

    Look at the timeline above. No frames go over the black lines, i.e., no frame takes more than 16 milliseconds to render. This represents optimal rendering with no frame drops, i.e., no jank for end users.

    Fig:- Dart profiler for suboptimal rendering

    Here, some frames in the timeline above are going over the horizontal black lines, i.e., some frames are taking more than 16 milliseconds to render. That is because the application was trying to load an image from the network while the user was also scrolling through the page. This means there is some performance bottleneck in this part of the application, which can be further optimized to ensure smoother rendering, i.e., a jank-free end-user experience.

    The two graphs mentioned above can be described as:

    1. UI thread: This is the first chart, and it portrays the timeline view of all Dart code execution. Instructions written by developers are executed on this thread, and a layer tree (for rendering) is created, which is then sent to the raster thread for rendering. 
    2. Raster thread: The raster thread runs the Skia engine, talks to the GPU, and is responsible for drawing the screen’s layer tree. Developers cannot directly instruct the raster thread. Most performance optimizations apply to the UI thread because the raster thread is already optimized by the Flutter team.

    Automatically Testing for Jank:

    Profiling the app gives some idea of which screens and user interactions may be optimized for performance, but it doesn’t give a concrete, reproducible assessment. So, let’s write some code to automate the process of profiling and detecting sources of lag in our Flutter app.

    First, import the Flutter driver extension in the application’s main entrypoint file and enable it. In most cases, this file is called main.dart and invokes the runApp() method.

    import 'package:flutter_driver/driver_extension.dart';
     
    void main() {
      enableFlutterDriverExtension();
      runApp(MyApp());
    }

    Next, let’s write a Flutter driver script to drive parts of the application that need to be profiled. Any and all user behavior such as navigation, taps, scroll, multipoint touches, and gestures can be simulated by a driver script.

    To measure the app’s rendering performance, we will make sure that we drive and test parts of the application exactly like a user would, i.e., we need to test interactions like taps and scrolls and transitions like page changes and back navigation. Flutter driver makes this simpler by providing a rich set of methods such as find(), tap(), and scroll().

    The driver script will also have to account for and mock any sources of latency, such as time taken during API calls or while reading a file from the local file system.

    We also need to run these automated tests multiple times to draw conclusions from average render times.
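    To make those averages concrete, here is a minimal pure-Dart sketch that averages one metric across the summary JSON produced by several runs (the helper name and the inlined JSON strings are illustrative, not part of flutter_driver):

    ```dart
    import 'dart:convert';

    // Averages a single metric across several timeline-summary JSON strings,
    // e.g. the contents of ui_timeline.timeline_summary.json from repeated runs.
    double averageMetric(List<String> summaries, String metric) {
      final values = summaries
          .map((s) => (jsonDecode(s) as Map<String, dynamic>)[metric] as num)
          .toList();
      return values.reduce((a, b) => a + b) / values.length;
    }

    void main() {
      const runs = [
        '{"average_frame_build_time_millis": 1.6}',
        '{"average_frame_build_time_millis": 2.0}',
        '{"average_frame_build_time_millis": 1.8}',
      ];
      print(averageMetric(runs, 'average_frame_build_time_millis')
          .toStringAsFixed(2)); // 1.80
    }
    ```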

    The following test driver script checks for a simple user interaction:

    • Launches the app
    • Waits for a list of items
    • Finds and clicks on the first list item, which takes users to a different page
    • Views some information on the page
    • Presses the back button to go back to the list

    The script also does the following:

    • Tracks time taken during each user interaction by wrapping interactions inside the driver.traceAction() method
    • Records and writes the UI thread and the raster thread timelines to a file ui_timeline.json

    import 'package:flutter_driver/flutter_driver.dart';
    import 'package:test/test.dart';
     
    void main() {
     group('App name - home', () {
       FlutterDriver driver;
     
       setUpAll(() async {
         driver = await FlutterDriver.connect();
       });
     
       tearDownAll(() async {
         if (driver != null) {
           await driver.close();
         }
       });
     
       test('list has row items', () async {
         final timeline = await driver.traceAction(() async {
           // wait for list items
           await driver.waitFor(find.byValueKey('placesList'));
     
           // get the first row in the list
           final firstRow = find.descendant(
               of: find.byValueKey('placesList'),
               matching: find.byType('PlaceRow'),
               firstMatchOnly: true);
     
           // tap on the first row
           await driver.tap(firstRow);
     
           // wait for place details
           await driver.waitFor(find.byValueKey("placeDetails"));
     
           // go back to lists
           await driver.tap(find.byTooltip('Back'));
         });
     
         // write summary to a file
         final summary = new TimelineSummary.summarize(timeline);
         await summary.writeSummaryToFile('ui_timeline', pretty: true);
         await summary.writeTimelineToFile('ui_timeline', pretty: true);
       });
     });

    To run the script, the following command can be executed on the terminal:

    flutter drive -t lib/main.dart --driver test_driver/main_test.dart --profile

    The test driver creates a release-like app bundle that is installed on the target device and driven by the driver script. This test is recommended to be run on a real device, preferably the slowest device targeted by the app.

    Once the script finishes execution, two JSON files are written to the build directory.

    ./build/ui_timeline.timeline_summary.json
    ./build/ui_timeline.timeline.json

    Viewing the Results:

    Launch the Google Chrome web browser and go to URL: chrome://tracing. Click on the load button on the top left and load the file ui_timeline.timeline.json.

    The timeline summary when loaded into the tracing tool can be used to walk through the hierarchical timeline of the application and exposes various metrics, such as CPU duration, start time, etc., to better understand sources of performance issues in the app. The tracing tool is versatile and displays methods invoked under the hood in a hierarchical view that can be navigated through by mouse or by pressing A, S, D, F keys. 

    Fig:- Chrome tracing in action

    The other file, i.e., the timeline_summary file, can be opened in a code editor and examined for performance data. It provides a set of metrics related to the performance of the application. For example, the flutter_driver script above outputs the following summary on a single run:

    "average_frame_build_time_millis": 1.6940195121951216,
     "90th_percentile_frame_build_time_millis": 2.678,
     "99th_percentile_frame_build_time_millis": 7.538,
     "worst_frame_build_time_millis": 14.687,
     "missed_frame_build_budget_count": 0,
     "average_frame_rasterizer_time_millis": 6.147395121951226,
     "90th_percentile_frame_rasterizer_time_millis": 9.029,
     "99th_percentile_frame_rasterizer_time_millis": 15.961,
     "worst_frame_rasterizer_time_millis": 21.476,
     "missed_frame_rasterizer_budget_count": 2,
     "frame_count": 205,
     "frame_rasterizer_count": 205,
     "average_vsync_transitions_missed": 1.5,
     "90th_percentile_vsync_transitions_missed": 2.0,
     "99th_percentile_vsync_transitions_missed": 2.0
    }

    Each of these metrics can be inspected, analyzed, and optimized. For example, the value of average_frame_build_time_millis should always be below 16 milliseconds to ensure that the app runs at 60 frames per second. 
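    These numbers can also feed an automated pass/fail gate, for example in CI. A minimal pure-Dart sketch, assuming the summary JSON has been decoded into a map (the function name and the 16 ms default, which corresponds to a 60 fps target, are our own choices):

    ```dart
    // Returns true when the decoded timeline summary stays within the
    // frame-build budget: a low average build time and no missed frames.
    bool meetsFrameBudget(Map<String, num> summary, {double budgetMillis = 16}) =>
        (summary['average_frame_build_time_millis'] ?? double.infinity) <
            budgetMillis &&
        (summary['missed_frame_build_budget_count'] ?? 1) == 0;

    void main() {
      final summary = <String, num>{
        'average_frame_build_time_millis': 1.69,
        'missed_frame_build_budget_count': 0,
      };
      print(meetsFrameBudget(summary)); // true
    }
    ```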

    More details about each of these fields can be found here.

    Conclusion

    In this blog post, we explored how to profile and measure the performance of a Flutter application. We also explored ways to identify and fix performance pitfalls, if any.

    We then created a Flutter driver script to automate performance testing of Flutter apps and produce a summary of rendering timelines as well as various performance metrics such as average_frame_build_time_millis.

    The automated performance tests ensure that the app is tested for performance in a reproducible way against different devices and can be run multiple times to calculate a running average and draw various insights. These metrics can be objectively looked at to measure the performance of an application and fix any bottlenecks in the application.

    A performant app means faster rendering and optimal resource utilization, which is essential to ensuring a jank-free and smooth user experience. It also contributes greatly to an app’s popularity. Do try profiling and analyzing the performance of some of your Flutter apps!

    Related Articles

    1. A Primer To Flutter

    2. Building High-performance Apps: A Checklist To Get It Right

  • A Primer To Flutter

    In this blog post, we will explore the basics of cross-platform mobile application development using Flutter, compare it with existing cross-platform solutions, and create a simple to-do application to demonstrate how quickly we can build apps with Flutter.

    Brief introduction

    Flutter is a free and open-source UI toolkit for building natively compiled applications for mobile platforms like Android and iOS, as well as for the web and desktop. Some of its prominent features are native performance, a single codebase for multiple platforms, quick development, and a wide range of beautifully designed widgets.

    Flutter apps are written in the Dart programming language, which is a very intuitive language with a C-like syntax. Dart is optimized for performance and developer friendliness. Apps written in Dart can be as fast as native applications because Dart code compiles down to machine instructions for ARM and x64 processors, and to JavaScript for the web platform. This, along with the Flutter engine, makes Flutter apps platform agnostic.

    Other interesting Dart features used in Flutter apps are the just-in-time (JIT) compiler, used during development and debugging, which powers the hot reload functionality, and the ahead-of-time (AOT) compiler, used when building applications for target platforms such as Android or iOS, resulting in native performance.

    Everything composed on the screen with Flutter is a widget, including padding, alignment, and opacity. The Flutter engine draws and controls each pixel on the screen using its own graphics engine, Skia.

    Flutter vs React-Native

    Flutter apps are truly native and hence offer great performance, whereas apps built with React Native require a JavaScript bridge to interact with OEM widgets. Flutter apps are much faster to develop because of a wide range of built-in widgets, a good amount of documentation, hot reload, and several other developer-friendly choices made by Google while building Dart and Flutter. 

    React Native, on the other hand, has the advantage of being older and hence has a large community of businesses and developers with experience in building React Native apps. It also has more third-party libraries and packages compared to Flutter. That said, Flutter is catching up and rapidly gaining momentum, as evident from Stack Overflow’s 2019 developer survey, where it scored 75.4% under “Most Loved Frameworks, Libraries and Tools”.

     

    All in all, Flutter is a great tool to have in our arsenal as mobile developers in 2020.

    Getting started with a sample application

    Flutter’s official docs are really well written and include getting started guides for different OS platforms, API documentation, widget catalogue along with several cookbooks and codelabs that one can follow along to learn more about Flutter.

    To get started with development, we will follow the official guide which is available here. Flutter requires Flutter SDK as well as native build tools to be installed on the machine to begin development. To write apps, one may use Android Studios or VS Code, or any text editor can be used with Flutter’s command line tools. But a good rule of thumb is to install Android Studio because it offers better support for management of Android SDK, build tools and virtual devices. It also includes several built-in tools such as the icons and assets editor.

    Once done with the setup, we will start by creating a project. Open VS Code and create a new Flutter project:

    We should see the main file main.dart with some sample code (the counter application). We will edit this file to create our to-do app.

    Some of the features we will add to our to-do app:

    • Display a list of to-do items
    • Mark to-do items as completed
    • Add new item to the list

    Let’s start by creating a widget to hold our list of to-do items. This is going to be a StatefulWidget, which is a type of widget with some state. Flutter tracks changes to the state and redraws the widget when a new change in the state is detected.

    After creating the TodoList widget, our main.dart file looks like this:

    /// imports widgets from the material design 
    import 'package:flutter/material.dart';
    
    void main() => runApp(TodoApp());
    
    /// Stateless widgets must implement the build() method and return a widget. 
    /// The first parameter passed to build function is the context in which this widget is built
    class TodoApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'TODO',
          theme: ThemeData(
            primarySwatch: Colors.blue,
          ),
          home: TodoList(),
        );
      }
    }
    
    /// Stateful widgets must implement the createState() method
    /// The State of a stateful widget also has a build() method that receives a context
    class TodoList extends StatefulWidget {
      @override
      State<StatefulWidget> createState() => TodoListState();
    }
    
    class TodoListState extends State<TodoList> {
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text('Todo'),
          ),
          body: Text('Todo List'),
        );
      }
    }

    The TodoApp class here extends StatelessWidget, i.e., a widget without any state, whereas TodoList extends StatefulWidget. All Flutter apps are a combination of these two types of widgets. Stateless widgets must implement the build() method, whereas stateful widgets must implement the createState() method.

    Some built-in widgets used here are the MaterialApp widget, the Scaffold widget and AppBar and Text widgets. These all are imported from Flutter’s implementation of material design, available in the material.dart package. Similarly, to use native looking iOS widgets in applications, we can import widgets from the flutter/cupertino.dart package.

    Next, let’s create a model class that represents an individual to-do item. We will keep this simple, i.e., store only the label and completed status of the to-do item.

    class Todo {
      final String label;
      bool completed;
      Todo(this.label, this.completed);
    }

    The constructor in the code above uses one of Dart’s syntactic-sugar features to assign constructor arguments directly to instance variables. For more such interesting tidbits, take the Dart language tour.
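    For illustration, the shorthand above is equivalent to a constructor with an explicit initializer list. A runnable standalone snippet (the verbose named constructor is ours, added only for comparison):

    ```dart
    class Todo {
      final String label;
      bool completed;

      // Initializing-formal shorthand: assigns arguments to the fields directly.
      Todo(this.label, this.completed);

      // The long-form equivalent, written as a named constructor for comparison.
      Todo.verbose(String label, bool completed)
          : label = label,
            completed = completed;
    }

    void main() {
      final a = Todo('Write blog', false);
      final b = Todo.verbose('Write blog', false);
      print(a.label == b.label); // true
    }
    ```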

    Now let’s modify the TodoListState class to store a list of to-do items in its state and display it in a list. We will use ListView.builder to create a dynamic list of to-do items, and the Checkbox and Text widgets to display them.

    /// State is composed of all the variables declared in the State implementation of a Stateful widget
    class TodoListState extends State<TodoList> {
      final List<Todo> todos = [];
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text('Todo'),
          ),
          body: Padding(
            padding: EdgeInsets.all(16.0),
            child: todos.length > 0
                ? ListView.builder(
                    itemCount: todos.length,
                    itemBuilder: _buildRow,
                  )
                : Text('There is nothing here yet. Start by adding some Todos'),
          ),
        );
      }
    
      /// build a single row of the list
      Widget _buildRow(context, index) => Row(
            children: <Widget>[
              Checkbox(
                  value: todos[index].completed,
                  onChanged: (value) => _changeTodo(index, value)),
              Text(todos[index].label,
                  style: TextStyle(
                      decoration: todos[index].completed
                          ? TextDecoration.lineThrough
                          : null))
            ],
          );
    
      /// toggle the completed state of a todo item
      _changeTodo(int index, bool value) =>
          setState(() => todos[index].completed = value);
    }

    A few things to note here: private functions start with an underscore, functions with a single-line body can be written using fat arrows (=>), and most importantly, to change the state of any variable contained in a stateful widget, one must call the setState() method.

    The ListView.builder constructor allows us to work with very large lists, since list items are created only when they are scrolled into view.

    Another takeaway here is the fact that Dart is such an intuitive language that it is quite easy to understand and you can start writing Dart code immediately.

    Everything on a screen, like padding, alignment or opacity, is a widget. Notice in the code above, we have used Padding as a widget that wraps the list or a text widget depending on the number of to-do items. If there’s nothing in the list, a text widget is displayed with some default message.

    Also note how we haven’t used the new keyword when creating instances of a class, say Text. That’s because the new keyword is optional in Dart and discouraged, according to the Effective Dart guidelines.
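    A tiny standalone snippet (the Greeting class is hypothetical, not from the app) showing that both forms construct the same object, with the new-less form being the recommended style:

    ```dart
    class Greeting {
      final String text;
      Greeting(this.text);
    }

    void main() {
      final preferred = Greeting('hello'); // idiomatic: no `new`
      final legacy = new Greeting('hello'); // still valid, but discouraged
      print(preferred.text == legacy.text); // true
    }
    ```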

    Running the application

    At this point, let’s run the code and see how the app looks on a device. Press F5, then select a virtual device and wait for the app to get installed. If you haven’t created a virtual device yet, refer to the getting started guide.

    Once the virtual device launches, we should see the following screen in a while. During development, the first launch always takes a while because the entire app gets built and installed on the virtual device, but subsequent changes to code are instantly reflected on the device, thanks to Flutter’s amazing hot reload feature. This reduces development time and also allows developers and designers to experiment more frequently with the interface changes.

    As we can see, there are no to-dos here yet. Now let’s add a floating action button that opens a dialog which we will use to add new to-do items.

    Adding the FAB is as easy as passing the floatingActionButton parameter to the Scaffold widget.

    floatingActionButton: FloatingActionButton(
      child: Icon(Icons.add),                                /// uses the built-in icons
      onPressed: () => _promptDialog(context),
    ),

    Then declare a function inside TodoListState that displays a popup (AlertDialog) with a text input box.

    /// display a dialog that accepts text
      _promptDialog(BuildContext context) {
        String _todoLabel = '';
        return showDialog(
            context: context,
            builder: (context) {
              return AlertDialog(
                title: Text('Enter TODO item'),
                content: TextField(
                    onChanged: (value) => _todoLabel = value,
                    decoration: InputDecoration(hintText: 'Add new TODO item')),
                actions: <Widget>[
                  FlatButton(
                    child: new Text('CANCEL'),
                    onPressed: () => Navigator.of(context).pop(),
                  ),
                  FlatButton(
                    child: new Text('ADD'),
                    onPressed: () {
                      setState(() => todos.add(Todo(_todoLabel, false)));
                      /// dismisses the alert dialog
                      Navigator.of(context).pop();
                    },
                  )
                ],
              );
            });
      }

    At this point, saving changes to the file should result in the application getting updated on the virtual device (hot reload), so we can just click on the new floating action button that appeared on the bottom right of the screen and start testing how the dialog looks.

    We used a few more built-in widgets here:

    • AlertDialog: a dialog prompt that opens up when clicking on the FAB
    • TextField: text input field for accepting user input
    • InputDecoration: a widget that adds style to the input field
    • FlatButton: a variation of button with no border or shadow
    • FloatingActionButton: a floating icon button, used to trigger primary action on the screen

    Here’s a quick preview of how the application should look and function at this point:

    And just like that, in less than 100 lines of code, we’ve built the user interface of a simple, cross-platform to-do application.

    The source code for this application is available here.


    Conclusion:

    To conclude, Flutter is an extremely powerful toolkit for building cross-platform applications that have native performance and are beautiful to look at. Dart, the language behind Flutter, is designed with the nuances of user-interface development in mind, and Flutter offers a wide range of built-in widgets. This makes development fun and development cycles shorter, something we experienced while building the to-do app. With Flutter, time to market is also greatly reduced, which enables teams to experiment more often, collect more feedback, and ship applications faster. And finally, Flutter has a very enthusiastic and thriving community of designers and developers who are always experimenting and adding to the Flutter ecosystem.

  • How to build High-Performance Flutter Apps using Streams

    Performance is a significant factor for any mobile app, and multiple factors such as architecture, logic, and memory management can cause low performance. When we develop an app in Flutter, the initial performance results are very good, but as development progresses, the negative effects of a poor codebase start showing up. This blog is aimed at using an architecture that improves Flutter app performance. We will briefly cover the following points:

    1. What is High-Performance Architecture?

    1.1. Framework

    1.2. Motivation

    1.3. Implementation

    2. Sample Project

    3. Additional benefits

    4. Conclusion

    1. What is High-Performance Architecture?

    This architecture uses streams instead of the variable-based state-management approach. Streams are the preferred approach for scenarios in which an app needs data in real time. Even with these benefits, why are streams not the first choice for developers? One reason is that streams are considered difficult and complicated, but that reputation is slightly overstated. 
    Dart is a programming language designed for a reactive-style system, i.e., an architecture with observable streams, as quoted by Flutter’s Director of Engineering, Eric Seidel, in this podcast. [Note: the podcast’s audio has been removed, but the part relevant to this architecture can be heard in Zaiste’s YouTube video.] 

    1.1. Framework: 

      Figure 01

    As shown in figure 01, we have 3 main components:

    • Supervisor: The Supervisor wraps the complete application and is responsible for creating the singletons of all managers, as well as providing those singletons to the screens that need them.
    • Managers: Each Manager has its own initialized streams that any screen can access through the respective singleton. These streams can hold data that we can use anywhere in the application. Plus, since we are using streams, any update to this data is reflected everywhere at the same time.
    • Screens: Screens are on the receiving end of this architecture. Each screen uses local streams for its own operations and, if global state is required, accesses streams from managers via their singletons.
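    Before bringing in any plugins, the three components above can be sketched with nothing but dart:async: a manager exposes a broadcast stream, and any number of screens subscribe to it. This is a simplified stand-in for the Supervisor/Manager/Screen setup, with plain listeners playing the role of screens (all names here are ours):

    ```dart
    import 'dart:async';

    // A minimal manager: it owns the state and exposes it as a broadcast
    // stream so several screens can listen to the same data.
    class CounterManager {
      int _count = 0;
      final _controller = StreamController<int>.broadcast();

      Stream<int> get count => _controller.stream;
      void increment() => _controller.add(++_count);
      Future<void> dispose() => _controller.close();
    }

    Future<void> main() async {
      final manager = CounterManager();

      // Two independent listeners, standing in for two screens.
      manager.count.listen((v) => print('screen A sees $v'));
      manager.count.listen((v) => print('screen B sees $v'));

      manager.increment();
      // Stream events are delivered asynchronously; yield once so both
      // listeners observe the update before we shut down.
      await Future<void>.delayed(Duration.zero);
      await manager.dispose();
    }
    ```

    Both listeners see every update, which is exactly the property the Manager layer relies on to keep multiple screens in sync.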

    1.2. Motivation:

    Zaiste proposed this idea in 2019 and created a plugin for such an architecture. He named it “Sprinkle Architecture,” and his plugin is called sprinkle, which made development easy to a certain level. But as of today, his plugin does not support the null-safety features introduced in Dart 2.12.0. You can read more about his implementation here and try his sample with the following command:

    flutter run --no-sound-null-safety

    1.3. Implementation:

    We will use the get and rxdart plugins in combination to create our high-performance architecture.

    The rxdart plugin will handle stream creation and manipulation, whereas the get plugin helps us with dependency injection, route management, and state management.

    2. Sample Project:

    We will create a sample project to understand how to implement this architecture.

    2.1. Create a project using following command:

    flutter create sprinkle_architecture

    2.2. Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    get: ^4.6.5
    rxdart: ^0.27.4

    2.3. Create three directories, constants, managers, and views, inside the lib directory:

    2.4. First, we will start with a manager that holds the counter stream and increments it. Create a Dart file named counter_manager.dart under the managers directory:

    import 'package:get/get.dart';
    
    class CounterManager extends GetLifeCycle {
        final RxInt count = RxInt(0);
        int get getCounter => count.value;
        void increment() => count.value = count.value + 1;
    } 

    2.5. With this, we have a working manager. Next, we’ll create a Supervisor that creates a singleton of every available manager. In our case, there is only one manager. Create a supervisor.dart file in the lib directory:

    import 'package:get/get.dart';
    import 'package:sprinkle_architecture/managers/counter_manager.dart';
    
    abstract class Supervisor {
     static Future<void> init() async {
       _initManagers();
     }
    
     static void _initManagers() {
       Get.lazyPut<CounterManager>(() => CounterManager());
     }
    }

    2.6. This application has only one screen, but it is good practice to create constants related to routing, so let’s add the route details. Create a Dart file route_paths.dart under the constants directory:

    abstract class RoutePaths {
      static const String counterPage = '/';
    }

    2.7. And route_pages.dart under constants directory:

    import 'package:get/get.dart';
    import 'package:sprinkle_architecture/constants/route_paths.dart';
    import 'package:sprinkle_architecture/managers/counter_manager.dart';
    import 'package:sprinkle_architecture/views/counter_page.dart';
    
    abstract class RoutePages {
     static final List<GetPage<dynamic>> pages = <GetPage<dynamic>>[
       GetPage<void>(
         name: RoutePaths.counterPage,
         page: () => const CounterPage(title: 'Flutter Demo Home Page'),
         binding: CounterPageBindings(),
       ),
     ];
    }
    
    class CounterPageBindings extends Bindings {
     @override
     void dependencies() => Get.lazyPut<CounterManager>(() => CounterManager());
    }

    2.8. Now we have a routing constant that we can use, but we do not yet have a CounterPage class. Before creating it, let’s update our main file:

    import 'package:flutter/material.dart';
    import 'package:get/get.dart';
    import 'package:sprinkle_architecture/constants/route_pages.dart';
    import 'package:sprinkle_architecture/constants/route_paths.dart';
    import 'package:sprinkle_architecture/supervisor.dart';
    
    void main() {
     WidgetsFlutterBinding.ensureInitialized();
     Supervisor.init();
     runApp(
       GetMaterialApp(
         initialRoute: RoutePaths.counterPage,
         getPages: RoutePages.pages,
       ),
     );
    }

    2.9. Next, add the file counter_page_controller.dart under the views directory:

    import 'package:get/get.dart';
    import 'package:sprinkle_architecture/managers/counter_manager.dart';
    
    class CounterPageController extends GetxController {
     final CounterManager manager = Get.find();
    }

    2.10. And finally, our landing page, counter_page.dart:

    import 'package:flutter/material.dart';
    import 'package:get/get.dart';
    import 'package:sprinkle_architecture/views/counter_page_controller.dart';
    
    class CounterPage extends GetWidget<CounterPageController> {
     const CounterPage({Key? key, required this.title}) : super(key: key);
     final String title;
    
     CounterPageController get c => Get.put(CounterPageController());
    
     @override
     Widget build(BuildContext context) {
       return Obx(() {
         return Scaffold(
           appBar: AppBar(title: Text(title)),
           body: Center(
             child: Column(
               mainAxisAlignment: MainAxisAlignment.center,
               children: <Widget>[
                 const Text('You have pushed the button this many times:'),
                 Text('${c.manager.getCounter}',
                     style: Theme.of(context).textTheme.headline4),
               ],
             ),
           ),
           floatingActionButton: FloatingActionButton(
             onPressed: c.manager.increment,
             tooltip: 'Increment',
             child: const Icon(Icons.add),
           ),
         );
       });
     }
    }

    2.11. The get plugin allows us to add one controller per screen by extending the GetxController class. In this controller, we can perform operations whose scope is limited to our screen. Here, CounterPageController provides CounterPage with the CounterManager singleton.

    If everything is done as per the steps above, we will end up with the following tree structure:

    2.12. Now we can test our project by running the following command:

    flutter run

    3. Additional Benefits:

    3.1. Self Aware UI:

    Since all managers in our application use streams to share data, whenever one screen changes a manager’s data, any other screen that depends on that data also updates itself in real time. This happens because of the listen() method of streams. 

    3.2. Modularization:

    We can have separate managers for handling REST APIs, preferences, app state info, etc., so modularization happens automatically. Plus, UI logic gets separated from business logic because we are using the GetxController provided by the get plugin.

    3.3. Small rebuild footprint:

    By default, calling setState() high in the tree can cause Flutter to rebuild a large part of the widget tree to update the UI, but with the get and rxdart plugins, only the dependent widgets rebuild.

    4. Conclusion

    We can achieve good performance of a Flutter app with an appropriate architecture as discussed in this blog. 

  • How to setup iOS app with Apple developer account and TestFlight from scratch

    In this article, we will discuss how to set up the Apple developer account, build an app (create IPA files), configure TestFlight, and deploy it to TestFlight for the very first time.

    There are tons of articles explaining how to configure and build an app, how to set up TestFlight, or how to set up an application for ad hoc distribution. However, most of them are either outdated or missing steps and can be misleading for someone doing this for the very first time.

    If you haven’t done this before, don’t worry: work through the details of this article, follow every step correctly, and you will be able to set up your iOS application end-to-end, ready for TestFlight or ad hoc distribution, within an hour.

    Prerequisites

    Before we start, please make sure you have:

    • A React Native Project created and opened in the XCode
    • XCode set up on your Mac
    • An Apple developer account with at least Developer or Admin access, so you can create the Identifiers and Certificates: https://developer.apple.com/account/ (if you don’t have an account yet, get it created first)
    • Access to App Store Connect with your Apple developer account: https://appstoreconnect.apple.com/

    The setup contains four major steps: 

    • Creating Certificates, Identifiers, and Profiles from your Apple Developer account
    • Configuring the iOS app using these Identifiers, Certificates, and Profiles in XCode
    • Setting up TestFlight and Internal Testers group on App Store Connect
    • Generating iOS builds, signing them, and uploading them to TestFlight on App Store Connect

    Certificates, Identifiers, and Profiles

    Before we do anything, we need to create:

    • Bundle Identifier – a unique app bundle ID used by the App Store to identify your app
    • A Certificate – to sign the iOS app before submitting it to the App Store
    • Provisioning Profile – for linking bundle ID and certificates together

    Bundle Identifiers

    For the App Store to recognize your app uniquely, we need to create a unique Bundle Identifier.

    Go to https://developer.apple.com/account: you will see the Certificates, Identifiers & Profiles tab. Click on Identifiers. 

    Click the Plus icon next to Identifiers:

    Select the App IDs option from the list of options and click Continue:

    Select App from app types and click Continue

    On the next page, enter the app ID and, optionally, select any services your application needs (you can enable them later, when you actually implement them). 

    Keep those unselected for now as we don’t need them for this setup.

    Once filled with all the information, please click on continue and register your Bundle Identifier.

    Generating Certificate

    Certificates can be generated in two ways:

    • By automatically managing certificates from Xcode
    • By manually generating them

    We will generate them manually.

    To create a certificate, we need a Certificate Signing Request (CSR), which is generated from your Mac’s Keychain Access application.

    Creating a Certificate Signing Request:

    Open the Keychain Access application and click on the Keychain Access menu item at the top left of the screen, then select Certificate Assistant -> Request a Certificate From a Certificate Authority.

    Enter the required information like email address and name, then select the Save to Disk option.

    Click Continue and save the request file somewhere you can easily find it when uploading it to your Apple developer account.
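    If you prefer the command line, the same CSR can be generated with openssl. This is a sketch with hypothetical filenames and subject fields (replace them with your own details); note that a key generated this way lives outside your keychain, so keep the .key file to import later:

    ```shell
    # Hypothetical filenames and subject fields -- replace with your own.
    # Generates a 2048-bit RSA key and a CSR that can be uploaded to the
    # Apple developer portal in place of the Keychain Access request file.
    openssl req -new -newkey rsa:2048 -nodes \
      -keyout ios_distribution.key \
      -out ios_distribution.certSigningRequest \
      -subj "/emailAddress=dev@example.com/CN=Example Developer/C=US"

    # Sanity-check the request before uploading it.
    openssl req -in ios_distribution.certSigningRequest -noout -verify
    ```

    If you go this route, import ios_distribution.key into Keychain Access alongside the downloaded certificate, so XCode can sign with it.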

    Now head back to the Apple developer account and click on Certificates. Click the + icon next to the Certificates title, and you will be taken to the new certificate form.

    Select the iOS Distribution (App Store and ad hoc) option. Here, you can select the required services this certificate will need from a list of options (for example, Apple Push Notification service). 

    As we don’t need any services, ignore it for now and click continue.

    On the next screen, upload the certificate signing request form we generated in the last step and click Continue.

    At this step, your certificate will be generated and will be available to download.

    NOTE: The certificate can be downloaded only once, so please download it and keep it in a secure location to use it in the future.

    Download your certificate and install it by double-clicking the downloaded certificate file. The certificate will be installed on your Mac and can be used for signing builds in the next steps.

    You can verify this by going back to the KeyChain Access app and seeing the newly installed certificate in the certificates list.
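    You can also confirm the installed signing identity from the terminal. A minimal sketch using macOS's security tool (on other systems the tool is absent, so the check is skipped):

    ```shell
    # macOS-only check; on other systems `security` is absent and we just note it.
    if command -v security >/dev/null; then
      # Lists signing identities (certificate + private key pairs) in the keychain.
      security find-identity -v -p codesigning
    else
      echo "security tool not available (not macOS); skipping check."
    fi
    ```

    Your new distribution certificate should appear in the listed identities; if it shows as a certificate without a matching private key, signing will fail.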

    Generating a Provisioning Profile

    Now link your identifier and certificate together by creating a provisioning profile.

    Let’s go back to the Apple developer account, select the profiles option, and select the + icon next to the Profiles title.

    You will be redirected to the new Profiles form page.

    Select Distribution Profile and click continue:

    Select the App ID we created in the first step and click Continue:

    Now, select the certificate we created in the previous step:

    Enter a Provisioning Profile name and click Generate:

    Once Profile is generated, it will be available to download, please download it and keep it at the same location where you kept Certificate for future usage.

    Configure App in XCode

    Now, we need to configure our iOS application using the bundle ID and the Apple developer account we used for generating the certificate and profiles.

    Open the <appname>.xcworkspace file in XCode and click on the app name in the left pane. It will open the app configuration page.

    Select the app from targets, go to signing and capabilities, and enter the bundle identifier. 

    Now, to let XCode manage the provisioning profile automatically, it needs access to the provisioning profile we generated recently. 

    For this, we need to sign in to XCode using your Apple ID.

    Select Preferences from the top left XCode Menu option, go to Accounts, and click on the + icon at the bottom.

    Select Apple ID from the list of account types, click Continue, and enter your Apple ID.

    It will prompt you to enter the password as well.

    Once successfully logged in, XCode will fetch all the provisioning profiles associated with this account. Verify that you see your project in the Teams section of this account page.

    Now, go back to the XCode Signing Capabilities page, select Automatically Manage Signing, and then select the required team from the Team dropdown.

    At this point, your application will be able to generate Archives to upload to TestFlight, or to sign ad hoc and distribute through other mediums (Diawi, etc.).

    Setup TestFlight

    TestFlight and the App Store are managed through the App Store Connect portal.

    Open the App Store Connect portal and log in to the application.

    After you log in, please make sure you have selected the correct team from the top right corner (you can check the team name just below the user name).

    Select My Apps from the list of options. 

    If this is the first time you are setting up an application on this team, you will see the + (Add app) option at the center of the page, but if your team has already set up applications, you will see the + icon right next to Apps Header.

    Click on the + icon and select New App Option:

    Enter the complete app details, like platform (iOS, macOS, or tvOS), app name, bundle ID (the one we created), SKU, and access type, and click the Create button.

    You should now be able to see your newly created application on the Apps menu. Select the app and go to TestFlight. You will see no builds there as we did not push any yet.

    Generate and upload the build to TestFlight

    At this point, we are fully ready to generate a build from XCode and push it to TestFlight. To do this, head back to XCode.

    On the top middle section, you will see your app name and a right arrow. There might be an iPhone or other simulator selected; please click on the options list and select Any iOS Device.

    Select the Product menu from the Menu list and click on the Archive option.

    Once the archive succeeds, XCode will open the Organizer window (you can also open this page from the Window menu).

    Here, we sign our application archive (build) using the certificate we created and upload it to the App Store Connect TestFlight.

    On the Organizer window, you will see the recently generated build. Please select the build and click on the Distribute App button in the right panel of the Organizer page.

    On the next page, select App Store Connect from the “Select a method of distribution” window and click Continue.

    NOTE: We are selecting the App Store Connect option as we want to upload a build to TestFlight, but if you want to distribute it privately using other channels, please select the Ad Hoc option.

    Select Upload from the “Select a Destination” options and click continue. This will prepare your build to submit it to App Store Connect TestFlight.

    For the first time, it will ask you how you want to sign the build, Automatically or Manually?

    Please Select Automatically and click the Next button.

    XCode may ask you to authenticate your certificate using your system password. Please authenticate it and wait until XCode uploads the build to TestFlight.

    Once the build is uploaded successfully, XCode will prompt you with the Success modal.

    Now, your app is uploaded to TestFlight and is being processed. Processing usually takes 5 to 15 minutes, after which TestFlight makes the build available for testing.
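    The XCode GUI steps above (archive, export, sign) can also be scripted with xcodebuild. A sketch under stated assumptions: a workspace MyApp.xcworkspace, a scheme MyApp, and a team ID ABCDE12345 are all placeholders you must replace, and the export only runs on a Mac with Xcode's tools installed:

    ```shell
    # Assumptions: workspace MyApp.xcworkspace, scheme MyApp, team ID
    # ABCDE12345 -- replace these with your project's real values.

    # ExportOptions.plist tells xcodebuild how to sign and package the archive.
    cat > ExportOptions.plist <<'EOF'
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>method</key>
        <string>app-store</string>
        <key>teamID</key>
        <string>ABCDE12345</string>
    </dict>
    </plist>
    EOF

    # Archive and export only where Xcode's command-line tools exist.
    if command -v xcodebuild >/dev/null; then
      xcodebuild -workspace MyApp.xcworkspace -scheme MyApp \
        -configuration Release -archivePath build/MyApp.xcarchive archive
      xcodebuild -exportArchive -archivePath build/MyApp.xcarchive \
        -exportOptionsPlist ExportOptions.plist -exportPath build
    else
      echo "xcodebuild not found; run this on macOS with Xcode installed."
    fi
    ```

    For ad hoc distribution, change the method value to ad-hoc; the exported .ipa in build/ can then be shared through other channels instead of being uploaded to TestFlight.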

    Add Internal Testers and other teammates to TestFlight

    Once we are done with all the setup and uploaded the build to TestFlight, we need to add internal testers to TestFlight.

    This is a two-step process: first add the user to App Store Connect, then add them to TestFlight.

    • Go to Users and Access.
    • Add a new user; App Store Connect sends an invitation to the user.
    • Once the user accepts the invitation, go to TestFlight -> Internal Testing.
    • In the Internal Testing section, create a new testing group if one is not added already, and add the user to that TestFlight testing group.

    Now, you should be able to configure the app, upload it to TestFlight, and add users to the TestFlight testing group.

    Hopefully, you enjoyed this article and it helped you set up your iOS application end-to-end quickly, without too much confusion. 

    Thanks.