Tag: dart

  • JNIgen: Simplify Native Integration in Flutter

    Prepare to embark on a journey through Flutter as we uncover a remarkable new feature: JNIgen. In this blog, we pull back the curtain on JNIgen’s transformative power, from simplifying intricate integration tasks to improving scalability, and chart a path toward a seamlessly integrated Flutter ecosystem.

    As Flutter continues to mesmerize developers with its constant evolution, each release unveiling a treasure trove of thrilling new features, the highly anticipated Google I/O 2023 was an extraordinary milestone. Amidst the excitement, a groundbreaking technique was unveiled: JNIgen, offering effortless access to native code like never before.

    Let this blog guide you towards a future where your Flutter projects transcend limitations and manifest into awe-inspiring creations.

    1. What is JNIgen?

    JNIgen, which stands for Java Native Interface generator, is an innovative tool that automates the process of generating Dart bindings for Android APIs written in Java or Kotlin. By utilizing these generated bindings, developers can invoke Android APIs with a syntax that closely resembles native code.

    With JNIgen, developers can seamlessly bridge the gap between Dart and the rich ecosystem of Android APIs. This empowers them to leverage the full spectrum of Android’s functionality, ranging from system-level operations to platform-specific features. By effortlessly integrating with Android APIs through JNIgen-generated bindings, developers can harness the power of native code and build robust applications with ease.

    1.1. Default approach: 

    In the current Flutter framework, we rely on Platform channels to establish a seamless communication channel between Dart code and native code. These channels serve as a bridge for exchanging messages and data.

    Typically, we have a Flutter app acting as the client, while the native code contains the desired methods to be executed. The Flutter app sends a message containing the method name to the native code, which then executes the requested method and sends the response back to the Flutter app.

    However, this approach requires the manual implementation of handlers on both the Dart and native code sides. It entails writing code to handle method calls and manage the exchange of responses. Additionally, developers need to carefully manage method names and channel names on both sides to ensure proper communication.
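    To make that manual boilerplate concrete, here is a minimal sketch of the Dart side of the channel-based approach (the channel name, method name, and helper function are illustrative, not from a real project):

    import 'package:flutter/services.dart';

    // The channel name must match the native handler exactly.
    const platform = MethodChannel('com.example/hardware');

    Future<String> getDeviceModel() async {
      // Throws MissingPluginException if the native side has not
      // registered a handler for this channel and method name.
      final model = await platform.invokeMethod<String>('getDeviceModel');
      return model ?? 'unknown';
    }

    A matching method-call handler must then be written by hand in Kotlin or Java, and both sides must keep the channel and method names in sync, which is exactly the overhead JNIgen aims to remove.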

    1.2. Working principle of JNIgen: 

    Figure 1

     

    In JNIgen, our native code path is passed to the JNIgen generator, which initiates the generation of an intermediate layer of C code. This C code is followed by the necessary boilerplate in Dart, facilitating access to the C methods. All data binding and C files are automatically generated in the directory specified in the .yaml file, which we will explore shortly.

    Consequently, as a Flutter application, our interaction is solely focused on interfacing with the newly generated Dart code, eliminating the need for direct utilization of native code.
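    In other words, once the bindings are generated, invoking native code reads like ordinary Dart. A brief sketch, using the HardwareUtils class from the sample project later in this blog:

    // The generated binding exposes the Java class as a Dart class;
    // no channel names or manual handlers are involved.
    final details = HardwareUtils().getHardwareDetails();
    details.forEach((key, value) {
      print('${key.toDartString()}: ${value.toDartString()}');
    });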

    1.3. Similar tools: 

    During the Google I/O 2023 event, JNIgen was introduced as a tool for native code integration. However, it is important to note that not all external libraries available on www.pub.dev are developed exclusively using channels. Another tool, FFIgen, was introduced earlier at Google I/O 2021 and serves a similar purpose. Both FFIgen and JNIgen function similarly, converting native code into intermediate C code with corresponding Dart dependencies to establish the necessary connections.

    While JNIgen primarily facilitates communication between Android native code and Dart code, FFIgen has become the preferred choice for establishing communication between iOS native code and Dart code. Both tools are specifically designed to convert native code into intermediate code, enabling seamless interoperability within their respective platforms.

    2. Configuration

    Prior to proceeding with the code implementation, it is essential to set up and install the necessary tools.

    2.1. System setup: 

    2.1.1 Install Maven (mvn)

    Windows

    • Download the Maven archive for Windows from the link here [download Binary zip archive]
    • After extracting the zip file, you will get a folder named “apache-maven-x.x.x”
    • Create a new folder named “ApacheMaven” in “C:\Program Files” and paste the above folder into it. [Your path will be “C:\Program Files\ApacheMaven\apache-maven-x.x.x”]
    • Add the following entries under “Environment Variables” → “User Variables”
      M2 ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x\bin”
      M2_HOME ⇒ “C:\Program Files\ApacheMaven\apache-maven-x.x.x”
    • Add a new entry “%M2_HOME%\bin” to the “Path” variable

    Mac

    • Download the Maven archive for Mac from the link here [download Binary tar.gz archive]
    • Run the following command in the directory where you downloaded the *.tar.gz file
    tar -xvf apache-maven-3.8.7-bin.tar.gz

    • Add the following entry to .zshrc or .bash_profile to set the Maven path:
    export PATH="$PATH:/Users/username/Downloads/apache-maven-x.x.x/bin"

    Or

    • You can use brew to install llvm 
    brew install llvm

    • Brew will give you instructions like the following for further setup
    ==> llvm
    To use the bundled libc++ please add the following LDFLAGS:
    LDFLAGS="-L/opt/homebrew/opt/llvm/lib/c++ -Wl,-rpath,/opt/homebrew/opt/llvm/lib/c++"
    
    llvm is keg-only, which means it was not symlinked into /opt/homebrew,
    because macOS already provides this software and installing another version in
    parallel can cause all kinds of trouble.
    
    If you need to have llvm first in your PATH, run:
    echo 'export PATH="/opt/homebrew/opt/llvm/bin:$PATH"' >> ~/.zshrc
    
    For compilers to find llvm you may need to set:
    export LDFLAGS="-L/opt/homebrew/opt/llvm/lib"
    export CPPFLAGS="-I/opt/homebrew/opt/llvm/include"

    2.1.2 Install Clang-Format

    Windows

    • Download the latest version of LLVM (which bundles clang-format) for Windows from the link here

    Mac

    • Run the following brew command: 
    brew install clang-format

    2.2. Flutter setup: 

    2.2.1 Get Dependencies

    Run the following commands with Flutter:

    flutter pub add jni

    flutter pub add jnigen

    2.2.2 Setup configuration file

    Figure 01 provides a visual representation of the .yaml file, which holds crucial configurations utilized by JNIgen. These configurations serve the purpose of identifying paths for native classes, as well as specifying the locations where JNIgen generates the resulting C and Dart files. Furthermore, the .yaml file allows for specifying Maven configurations, enabling the selection of specific third-party libraries that need to be downloaded to facilitate code generation.

    By leveraging the power of the .yaml file, developers gain control over the path identification process and ensure that the generated code is placed in the desired locations. Additionally, the ability to define Maven configurations grants flexibility in managing dependencies, allowing the seamless integration of required third-party libraries into the generated code. This comprehensive approach enables precise control and customization over the code generation process, enhancing the overall efficiency and effectiveness of the development workflow.

    Let’s explore the properties that we have utilized within the .yaml file (please refer to the example in section 3.2.2 for better understanding):

    • android_sdk_config: 

    When add_gradle_deps is set to “true,” a Gradle stub is executed during the invocation of JNIgen, and the Android compile classpath is added to JNIgen’s classpath. To ensure that all dependencies are cached appropriately, you must have previously performed a release build.
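    This corresponds to the following fragment of the configuration file (the same fragment appears in the full jnigen.yaml in section 3.2.2):

    ```yaml
    android_sdk_config:
      add_gradle_deps: true
    ```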

    • output:

    As the name implies, the “output” section defines the configuration related to the generation of intermediate code, determining how the generated code is organized.

    • c >> library_name && c >> path:

    Here we set the details for the C-based binding code.

    • dart >> path && dart >> structure:

    Here we define the configuration for the Dart-based binding code.

    • source_path:

    These are the directories scanned while locating the relevant source files.

    • classes:

    By providing a list of classes or packages, developers can effectively control the scope of the code generation process. This ensures that binding code is generated only for the desired components, minimizing unnecessary code generation.

    By utilizing these properties within the .yaml file, developers can effectively control various aspects of the code generation process, including path identification, code organization, and dependency management. To get more in-depth information, please check out the official documentation here.

    2.3. Generate bindings files:

    Once this setup is complete, the final step is to obtain the compiled classes (jar files) that JNIgen will scan to generate the required bindings. To do so, build the Android APK by executing the following command:

    flutter build apk

    Run the following command in your terminal to generate code:

    dart run jnigen --config jnigen.yaml

    2.4. Android setup: 

    Add the path of the CMakeLists.txt file in the buildTypes section of your android >> app >> build.gradle file:

    buildTypes {
            externalNativeBuild {
                cmake {
                    path <address of CMakeLists.txt>
                }
            }
        }

    With this configuration, we specify the path of the CMake file generated by JNIgen. This path declaration is crucial for locating the generated CMake file within the project structure.

    With the completion of the aforementioned steps, you are now ready to run your application and leverage all the native functions that have been integrated.

    3. Sample Project

    To gain hands-on experience and better understand JNIgen, let’s create a small project together. Follow the steps below to get started.

    Let’s start with:

    3.1. Packages & directories:

    3.1.1 Create a project using the following command:

    flutter create jnigen_integration_project

    3.1.2 Add these under dependencies of pubspec.yaml (and run command flutter pub get):

    jni: ^0.5.0
    jnigen: ^0.5.0

    3.1.3. Go to the android >> app >> src >> main directory.

    3.1.4. Create directories inside main as shown below:

    Figure 02 

    3.2. Code Implementation:

    3.2.1 We will start with the Android code. Create two files, HardwareUtils.java & HardwareUtilsKotlin.kt, inside the utils directory.

     HardwareUtilsKotlin.kt

    package com.hardware.utils
    
    import android.os.Build
    
    class HardwareUtilsKotlin {
    
       fun getHardwareDetails(): Map<String, String>? {
           val hardwareDetails: MutableMap<String, String> = HashMap()
           hardwareDetails["Language"] = "Kotlin"
           hardwareDetails["Manufacture"] = Build.MANUFACTURER
           hardwareDetails["Model No."] = Build.MODEL
           hardwareDetails["Type"] = Build.TYPE
           hardwareDetails["User"] = Build.USER
           hardwareDetails["SDK"] = Build.VERSION.SDK
           hardwareDetails["Board"] = Build.BOARD
           hardwareDetails["Version Code"] = Build.VERSION.RELEASE
           return hardwareDetails
       }
    }

     HardwareUtils.java 

    package com.hardware.utils;
    
    
    import android.os.Build;
    
    
    import java.util.HashMap;
    import java.util.Map;
    
    
    public class HardwareUtils {
    
    
       public Map<String, String> getHardwareDetails() {
           Map<String, String> hardwareDetails = new HashMap<String, String>();
           hardwareDetails.put("Language", "JAVA");
           hardwareDetails.put("Manufacture", Build.MANUFACTURER);
           hardwareDetails.put("Model No.", Build.MODEL);
           hardwareDetails.put("Type", Build.TYPE);
           hardwareDetails.put("User", Build.USER);
           hardwareDetails.put("SDK", Build.VERSION.SDK);
           hardwareDetails.put("Board", Build.BOARD);
           hardwareDetails.put("Version Code", Build.VERSION.RELEASE);
           return hardwareDetails;
       }
    
    
       public Map<String, String> getHardwareDetailsKotlin() {
           return new HardwareUtilsKotlin().getHardwareDetails();
       }
    
    
    }

    3.2.2 To provide the necessary configurations to JNIgen for code generation, we will create a .yaml file named jnigen.yaml in the root of the project.

       jnigen.yaml 

    android_sdk_config:
     add_gradle_deps: true
    
    
    output:
     c:
       library_name: hardware_utils
       path: src/
     dart:
       path: lib/hardware_utils.dart
       structure: single_file
    
    
    source_path:
     - 'android/app/src/main/java'
    
    
    classes:
     - 'com.hardware.utils'

    3.2.3 Let’s generate C & Dart code.

    Execute the following command to create APK:

    flutter build apk

    After the successful execution of the above command, execute the following command:

    dart run jnigen --config jnigen.yaml

    3.2.4 Add the address of CMakeLists.txt in your android >> app >> build.gradle file’s buildTypes section as shown below :

    buildTypes {
            externalNativeBuild {
                cmake {
                    path "../../src/CMakeLists.txt"
                }
            }
      }

    3.2.5. The final step is to call the methods from the Dart code generated by JNIgen.

    To do this, replace the MyHomePage class code in main.dart with the code below (you will also need to import package:jni/jni.dart and the generated lib/hardware_utils.dart at the top of main.dart).

    class MyHomePage extends StatefulWidget {
     const MyHomePage({super.key, required this.title});
    
     final String title;
    
     @override
     State<MyHomePage> createState() => _MyHomePageState();
    }
    
    class _MyHomePageState extends State<MyHomePage> {
     String _hardwareDetails = '';
     String _hardwareDetailsKotlin = '';
     JObject activity = JObject.fromRef(Jni.getCurrentActivity());
    
     @override
     void initState() {
       JMap<JString, JString> deviceHardwareDetails =
           HardwareUtils().getHardwareDetails();
       _hardwareDetails = 'Device details from Java class:\n';
       deviceHardwareDetails.forEach((key, value) {
         _hardwareDetails =
             '$_hardwareDetails\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       JMap<JString, JString> deviceHardwareDetailsKotlin =
           HardwareUtils().getHardwareDetailsKotlin();
       _hardwareDetailsKotlin = 'Device details from Kotlin class:\n';
       deviceHardwareDetailsKotlin.forEach((key, value) {
         _hardwareDetailsKotlin =
             '$_hardwareDetailsKotlin\n${key.toDartString()} is ${value.toDartString()}';
       });
    
       setState(() {
         _hardwareDetails;
         _hardwareDetailsKotlin;
       });
       super.initState();
     }
    
     @override
     Widget build(BuildContext context) {
       return Scaffold(
         appBar: AppBar(
           title: Text(widget.title),
         ),
         body: Center(
           child: Column(
             mainAxisAlignment: MainAxisAlignment.center,
             children: <Widget>[
               Text(
                 _hardwareDetails,
                 textAlign: TextAlign.center,
               ),
               SizedBox(height: 20,),
               Text(
                 _hardwareDetailsKotlin,
                 textAlign: TextAlign.center,
               ),
             ],
           ),
         ),
       );
     }
    }

    After all of this, when we launch our app, we will see information about our Android device.

    4. Result

    For your convenience, the complete code for the project can be found here. Feel free to refer to this code repository for a comprehensive overview of the implementation details and to access the entirety of the source code.

    5. Conclusion

    In conclusion, we explored the limitations of the traditional approach to native API access in Flutter for mid to large-scale projects. Through our insightful exploration of JNIgen’s working principles, we uncovered its remarkable potential for simplifying the native integration process.

    By gaining a deep understanding of JNIgen’s inner workings, we successfully developed a sample project and provided detailed guidance on the essential setup requirements. Armed with this knowledge, developers can embrace JNIgen’s capabilities to streamline their native integration process effectively.

    We can say that JNIgen is a valuable tool for Flutter developers seeking to combine the power of Flutter’s cross-platform capabilities with the flexibility and performance benefits offered by native code. It empowers developers to build high-quality apps that seamlessly integrate platform-specific features and existing native code libraries, ultimately enhancing the overall user experience. 

    Hopefully, this blog post has inspired you to explore the immense potential of JNIgen in your Flutter applications. By harnessing the JNIgen, we can open doors to new possibilities.

    Thank you for taking the time to read through this blog!

    6. Reference

    1. https://docs.flutter.dev/
    2. https://pub.dev/packages/jnigen
    3. https://pub.dev/packages/jni
    4. https://github.com/dart-lang/jnigen
    5. https://github.com/dart-lang/jnigen#readme
    6. https://github.com/dart-lang/jnigen/wiki/Architecture-&-Design-Notes
    7. https://medium.com/simform-engineering/jnigen-an-easy-way-to-access-platform-apis-cb1fd3101e33
    8. https://medium.com/@marcoedomingos/the-ultimate-showdown-methodchannel-vs-d83135f2392d
  • Integrating Augmented Reality in a Flutter App to Enhance User Experience

    In recent years, augmented reality (AR) has emerged as a cutting-edge technology that has revolutionized various industries, including gaming, retail, education, and healthcare. Its ability to blend digital information with the real world has opened up a new realm of possibilities. One exciting application of AR is integrating it into mobile apps to enhance the user experience.

    In this blog post, we will explore how to leverage Flutter, a powerful cross-platform framework, to integrate augmented reality features into mobile apps and elevate the user experience to new heights.

    Understanding Augmented Reality:

    Before we dive into the integration process, let’s briefly understand what augmented reality is. Augmented reality is a technology that overlays computer-generated content onto the real world, enhancing the user’s perception and interaction with their environment. Unlike virtual reality (VR), which creates a fully simulated environment, AR enhances the real world by adding digital elements such as images, videos, and 3D models.

    The applications of augmented reality are vast and span across different industries. In gaming, AR has transformed mobile experiences by overlaying virtual characters and objects onto the real world. It has also found applications in areas such as marketing and advertising, where brands can create interactive campaigns by projecting virtual content onto physical objects or locations. AR has also revolutionized education by offering immersive learning experiences, allowing students to visualize complex concepts and interact with virtual models.

    In the upcoming sections, we will explore the steps to integrate augmented reality features into mobile apps using Flutter.

    What is Flutter?

    Flutter is an open-source UI (user interface) toolkit developed by Google for building natively compiled applications for mobile, web, and desktop platforms from a single codebase. It allows developers to create visually appealing and high-performance applications with a reactive and customizable user interface.

    The core language used in Flutter is Dart, which is also developed by Google. Dart is a statically typed, object-oriented programming language that comes with modern features and syntax. It is designed to be easy to learn and offers features like just-in-time (JIT) compilation during development and ahead-of-time (AOT) compilation for optimized performance in production.

    Flutter provides a rich set of customizable UI widgets that enable developers to build beautiful and responsive user interfaces. These widgets can be composed and combined to create complex layouts and interactions, giving developers full control over the app’s appearance and behavior.

    Why Choose Flutter for AR Integration?

    Flutter, backed by Google, is a versatile framework that enables developers to build beautiful and performant cross-platform applications. Its rich set of UI components and fast development cycle make it an excellent choice for integrating augmented reality features. By using Flutter, developers can write a single codebase that runs seamlessly on both Android and iOS platforms, saving time and effort.

    Flutter’s cross-platform capabilities enable developers to write code once and deploy it on multiple platforms, including iOS, Android, web, and even desktop (Windows, macOS, and Linux).

    The Flutter ecosystem is supported by a vibrant community, offering a wide range of packages and plugins that extend its capabilities. These packages cover various functionalities such as networking, database integration, state management, and more, making it easy to add complex features to your Flutter applications.

    Steps to Integrate AR in a Flutter App:

    Step 1: Set Up Flutter Project:

    Assuming you already have Flutter installed on your system, create a new Flutter project or open an existing one to start integrating AR features. If not, follow the installation guide at https://docs.flutter.dev/get-started/install to set up Flutter.

    Step 2: Add ar_flutter_plugin dependency:

    Update the pubspec.yaml file of your Flutter project and add the following line under the dependencies section:

    dependencies:
      ar_flutter_plugin: ^0.7.3

    This step ensures that your Flutter project has the necessary dependencies to integrate augmented reality using the ar_flutter_plugin package.
    Run `flutter pub get` to fetch the package.

    Step 3: Initializing the AR View:

    Create a new Dart file for the AR screen and import the required packages at the top of the file (the exact package paths below follow the ar_flutter_plugin example and may vary by plugin version):

    import 'package:flutter/material.dart';
    import 'package:ar_flutter_plugin/ar_flutter_plugin.dart';
    import 'package:ar_flutter_plugin/managers/ar_session_manager.dart';
    import 'package:ar_flutter_plugin/managers/ar_object_manager.dart';
    import 'package:ar_flutter_plugin/managers/ar_anchor_manager.dart';
    import 'package:ar_flutter_plugin/managers/ar_location_manager.dart';
    import 'package:ar_flutter_plugin/datatypes/config_planedetection.dart';
    import 'package:ar_flutter_plugin/datatypes/node_types.dart';
    import 'package:ar_flutter_plugin/datatypes/hittest_result_types.dart';
    import 'package:ar_flutter_plugin/models/ar_node.dart';
    import 'package:ar_flutter_plugin/models/ar_anchor.dart';
    import 'package:ar_flutter_plugin/models/ar_hittest_result.dart';
    import 'package:vector_math/vector_math_64.dart';

    Define a new class called ARScreen that extends StatefulWidget, along with its State class. These represent the AR screen and handle the initialization and rendering of the AR view:

    class ARScreen extends StatefulWidget {
      const ARScreen({Key? key}) : super(key: key);

      @override
      _ARScreenState createState() => _ARScreenState();
    }

    class _ARScreenState extends State<ARScreen> {
      ARSessionManager? arSessionManager;
      ARObjectManager? arObjectManager;
      ARAnchorManager? arAnchorManager;

      List<ARNode> nodes = [];
      List<ARAnchor> anchors = [];

      @override
      void dispose() {
        arSessionManager!.dispose();
        super.dispose();
      }

      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: const Text('Anchors & Objects on Planes'),
          ),
          body: Stack(children: [
            ARView(
              onARViewCreated: onARViewCreated,
              planeDetectionConfig: PlaneDetectionConfig.horizontalAndVertical,
            ),
            Align(
              alignment: FractionalOffset.bottomCenter,
              child: Row(
                mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                children: [
                  ElevatedButton(
                      onPressed: onRemoveEverything,
                      child: const Text("Remove Everything")),
                ],
              ),
            ),
          ]),
        );
      }

    Step 4: Add AR functionality:

    Create a method onARViewCreated for the ARView’s onARViewCreated callback. You can add the required AR functionality in this method, such as loading 3D models or handling interactions. In our demo, we will add 3D models in AR on tap:

    void onARViewCreated(
          ARSessionManager arSessionManager,
          ARObjectManager arObjectManager,
          ARAnchorManager arAnchorManager,
          ARLocationManager arLocationManager) {
        this.arSessionManager = arSessionManager;
        this.arObjectManager = arObjectManager;
        this.arAnchorManager = arAnchorManager;
    
        this.arSessionManager!.onInitialize(
              showFeaturePoints: false,
              showPlanes: true,
              customPlaneTexturePath: "Images/triangle.png",
              showWorldOrigin: true,
            );
        this.arObjectManager!.onInitialize();
    
        this.arSessionManager!.onPlaneOrPointTap = onPlaneOrPointTapped;
        this.arObjectManager!.onNodeTap = onNodeTapped;
      }

    After this, create a method onPlaneOrPointTapped for handling interactions.

    Future<void> onPlaneOrPointTapped(
          List<ARHitTestResult> hitTestResults) async {
        var singleHitTestResult = hitTestResults.firstWhere(
            (hitTestResult) => hitTestResult.type == ARHitTestResultType.plane);
        var newAnchor =
            ARPlaneAnchor(transformation: singleHitTestResult.worldTransform);
        bool? didAddAnchor = await arAnchorManager!.addAnchor(newAnchor);
        if (didAddAnchor!) {
          anchors.add(newAnchor);
          // Add node to anchor
          var newNode = ARNode(
              type: NodeType.webGLB,
              uri:
    "https://github.com/KhronosGroup/glTF-Sample-Models/raw/master/2.0/Duck/glTF-Binary/Duck.glb",
              scale: Vector3(0.2, 0.2, 0.2),
              position: Vector3(0.0, 0.0, 0.0),
              rotation: Vector4(1.0, 0.0, 0.0, 0.0));
          bool? didAddNodeToAnchor = await arObjectManager!
              .addNode(newNode, planeAnchor: newAnchor);
          if (didAddNodeToAnchor!) {
            nodes.add(newNode);
          } else {
            arSessionManager!.onError("Adding Node to Anchor failed");
          }
        } else {
          arSessionManager!.onError("Adding Anchor failed");
        }
      }

    Finally, create a method for onRemoveEverything to remove all the elements on the screen.

    Future<void> onRemoveEverything() async {
      for (var anchor in anchors) {
        arAnchorManager!.removeAnchor(anchor);
      }
      anchors = [];
    }

    Step 5: Run the AR screen:

    In your app’s main entry point, set the ARScreen as the home screen:

    void main() {
      runApp(MyApp());
    }
    
    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          home: ARScreen(),
        );
      }
    }

    In the example below, we can observe the AR functionality implemented. We are loading a Duck 3D Model whenever the user taps on the screen. The plane is auto-detected, and once that is done, we can add a model to it. We also have a floating button to remove everything that is on the plane at the given moment.

    Benefits of AR Integration:

    • Immersive User Experience: Augmented reality adds an extra dimension to user interactions, creating immersive and captivating experiences. Users can explore virtual objects within their real environment, leading to increased engagement and satisfaction.
    • Interactive Product Visualization: AR allows users to visualize products in real-world settings before making a purchase. They can view how furniture fits in their living space, try on virtual clothes, or preview architectural designs. This interactive visualization enhances decision-making and improves customer satisfaction.
    • Gamification and Entertainment: Augmented reality opens up opportunities for gamification and entertainment within apps. You can develop AR games, quizzes, or interactive storytelling experiences, providing users with unique and enjoyable content.
    • Marketing and Branding: By incorporating AR into your Flutter app, you can create innovative marketing campaigns and branding experiences. AR-powered product demonstrations, virtual try-ons, or virtual showrooms help generate excitement around your brand and products.

    Conclusion:

    Integrating augmented reality into a Flutter app brings a new level of interactivity and immersion to the user experience. Flutter’s compatibility with AR frameworks like ARCore and ARKit empowers developers to create captivating and innovative mobile applications. By following the steps outlined in this blog post, you can unlock the potential of augmented reality and deliver exceptional user experiences that delight and engage your audience. Embrace the possibilities of AR in Flutter and embark on a journey of exciting and immersive app development.

  • What’s New with Material 3 in Flutter: Discussing the Key Updates with an Example

    At Google I/O 2021, Google unveiled Material You, the next evolution of Material Design, along with Android 12. This update introduced Material Design 3 (M3), bringing a host of significant changes and improvements to the Material Design system. For Flutter developers, adopting Material 3 offers a seamless and consistent design experience across multiple platforms. In this article, we will delve into the key changes of Material 3 in Flutter and explore how it enhances the app development process.

    1. Dynamic Color:

    One of the notable features of Material 3 is dynamic color, which enables developers to apply consistent colors throughout their apps. By leveraging the Material Theme Builder web app or the Figma plugin, developers can visualize and create custom color schemes based on a given seed color. The dynamic color system ensures that colors from different tonal palettes are applied consistently across the UI, resulting in a harmonious visual experience.

    2. Typography:

    Material 3 simplifies typography by categorizing it into five key groups: Display, Headline, Title, Body, and Label. This categorization makes using different sizes within each group easier, catering to devices with varying screen sizes. The scaling of typography has also become consistent across the groups, offering a more streamlined and cohesive approach to implementing typography in Flutter apps.

    3. Shapes:

    Material 3 introduces a wider range of shapes, including squared, rounded, and rounded rectangular shapes. Previously circular elements, such as the FloatingActionButton (FAB), have now transitioned to a rounded rectangular shape. Additionally, widgets like Card, Dialog, and BottomSheet feature a more rounded appearance in Material 3. These shape enhancements give developers more flexibility in designing visually appealing and modern-looking user interfaces.

    4. Elevation:

    In Material Design 2, elevated components had shadows that varied based on their elevation values. Material 3 takes this a step further by introducing the surfaceTintColor property. This property applies a color to the surface of elevated components, with the intensity varying based on the elevation value. By incorporating surfaceTintColor, elevated components remain visually distinguishable even without shadows, resulting in a more polished and consistent UI.
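    As a quick illustration, surfaceTintColor can be set directly on elevated widgets such as Card (a minimal sketch; the color and elevation values here are arbitrary):

    Card(
      elevation: 4,
      // In Material 3 the surface is tinted with this color;
      // higher elevation produces a stronger tint.
      surfaceTintColor: Colors.green,
      child: const Padding(
        padding: EdgeInsets.all(16),
        child: Text('Elevated card with a surface tint'),
      ),
    )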

    Let’s go through each of them in detail.

    Dynamic Color

    Dynamic color in Flutter enables you to apply consistent colors throughout your app. It includes key and neutral colors from different tonal palettes, ensuring a harmonious UI experience. You can use tools like the Material Theme Builder or the Figma plugin to visualize and generate a custom color scheme. By providing a seed color in your app’s theme, you can easily create an M3 ColorScheme. For example, adding “colorSchemeSeed: Colors.green” to your app will result in a lighter green color for elements like the FloatingActionButton (FAB), giving your app a customized look.

    theme: ThemeData(
      // primarySwatch: Colors.blue,
      useMaterial3: true,
      colorSchemeSeed: Colors.green,
    ),

    Note:
    When using colorSchemeSeed in Flutter, keep in mind that it cannot be combined with a primarySwatch: if both are set in your app’s theme, you will hit an assertion error, because ThemeData does not allow the two to be non-null at the same time. To avoid this, remove (or comment out) the primarySwatch when you provide a colorSchemeSeed, as shown in the snippet above.


    Typography

    In Material 3, the naming of typography has been made simpler by dividing it into five main groups: 

    1. Display 
    2. Headline 
    3. Title 
    4. Body 
    5. Label

    Each group has a more descriptive role, making it easier to use different font sizes within a specific typography group. For example, instead of using names like bodyText1, bodyText2, and caption, Material 3 introduces names like bodyLarge, bodyMedium, and bodySmall. This improved naming system is particularly helpful when designing typography for devices with varying screen sizes.
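
    As an illustration, here is a small widget that uses the new Material 3 style names (a minimal sketch; the corresponding Material 2 names are noted in comments):

    ```dart
    import 'package:flutter/material.dart';

    class TypographyDemo extends StatelessWidget {
      const TypographyDemo({Key? key}) : super(key: key);

      @override
      Widget build(BuildContext context) {
        final textTheme = Theme.of(context).textTheme;
        return Column(
          crossAxisAlignment: CrossAxisAlignment.start,
          children: [
            // Material 2: headline5 -> Material 3: headlineSmall
            Text('A headline', style: textTheme.headlineSmall),
            // Material 2: bodyText1 -> Material 3: bodyLarge
            Text('Readable body copy', style: textTheme.bodyLarge),
            // Material 2: caption -> Material 3: bodySmall
            Text('A small caption', style: textTheme.bodySmall),
          ],
        );
      }
    }
    ```

    Because all three styles come from the same TextTheme, they scale together consistently across screen sizes.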

    Shapes

    Material 3 introduces an expanded selection of shapes, including square, rounded, and rounded rectangular shapes. The Floating Action Button (FAB), which used to be circular, now has a rounded rectangular shape. Material buttons have transitioned from rounded rectangular to pill-shaped. Additionally, widgets such as Card, Dialog, and BottomSheet have adopted a more rounded appearance in Material 3.
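
    These rounder defaults apply automatically once Material 3 is enabled, but a widget’s shape can still be overridden explicitly. A minimal sketch, with an illustrative corner radius:

    ```dart
    import 'package:flutter/material.dart';

    class ShapeDemo extends StatelessWidget {
      const ShapeDemo({Key? key}) : super(key: key);

      @override
      Widget build(BuildContext context) {
        return Column(
          children: [
            // Cards pick up a rounded rectangular shape by default in M3;
            // providing an explicit shape overrides that default.
            Card(
              shape: RoundedRectangleBorder(
                borderRadius: BorderRadius.circular(16), // illustrative radius
              ),
              child: const Padding(
                padding: EdgeInsets.all(16),
                child: Text('A rounded Material 3 card'),
              ),
            ),
            // In M3 the FAB is a rounded rectangle rather than a circle.
            FloatingActionButton(
              onPressed: () {},
              child: const Icon(Icons.add),
            ),
          ],
        );
      }
    }
    ```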

    Elevation

    In Material 2, elevated components were accompanied by shadows, with the size of the shadow increasing as the elevation increased. Material 3 brings a new feature called surfaceTintColor. When applied to elevated components, the surface of these components takes on the specified color, with the intensity varying based on the elevation value. This property is now available for all elevated widgets in Flutter, alongside elevation and shadow properties.
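
    For example, a Card can rely on surfaceTintColor instead of a shadow to signal its elevation (a minimal sketch; the color choices are illustrative):

    ```dart
    import 'package:flutter/material.dart';

    class ElevationDemo extends StatelessWidget {
      const ElevationDemo({Key? key}) : super(key: key);

      @override
      Widget build(BuildContext context) {
        return Card(
          elevation: 6, // higher elevation -> stronger surface tint
          surfaceTintColor: Theme.of(context).colorScheme.primary,
          shadowColor: Colors.transparent, // let the tint alone convey elevation
          child: const Padding(
            padding: EdgeInsets.all(16),
            child: Text('Elevated without a shadow'),
          ),
        );
      }
    }
    ```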

    Here’s an example Flutter app that demonstrates the key changes in Material 3 regarding dynamic color, typography, shapes, and elevation. This example app includes a simple screen with a colored container and text, showcasing the usage of these new features:

    //main.dart
    import 'package:flutter/material.dart';
    void main() {
      runApp(const MyApp());
    }
    class MyApp extends StatelessWidget {
      const MyApp({Key? key}) : super(key: key);
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          debugShowCheckedModeBanner: false,
          theme: ThemeData(
            useMaterial3: true,
            colorSchemeSeed: Colors.green,
          ),
          home: const MyHomePage(),
        );
      }
    }
    class MyHomePage extends StatelessWidget {
      const MyHomePage({Key? key}) : super(key: key);
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text(
              'Material 3 Key Changes',
              style: Theme.of(context).textTheme.headlineSmall,
            ),
            elevation: 8,
            shadowColor: Theme.of(context).shadowColor,
          ),
          body: Container(
            width: double.infinity,
            height: 200,
            color: Theme.of(context).colorScheme.secondary,
            padding: const EdgeInsets.all(16.0),
            child: Center(
              child: Text(
                'Hello, Material 3!',
                style: Theme.of(context).textTheme.bodyLarge?.copyWith(
                      color: Colors.white,
                    ),
              ),
            ),
          ),
          floatingActionButton: FloatingActionButton(
            onPressed: () {},
            child: const Icon(Icons.add),
          ),
        );
      }
    }

    Conclusion:

    Material 3 represents a significant update to the Material Design system in Flutter, offering developers a more streamlined and consistent approach to app design. The dynamic color feature allows for consistent colors throughout the UI, while the simplified typography and expanded shape options provide greater flexibility in creating visually engaging interfaces. Moreover, the enhancements in elevation ensure a cohesive and polished look for elevated components.

    As Flutter continues to evolve and adapt to Material 3, developers can embrace these key changes to create beautiful, personalized, and accessible designs across different platforms. The Flutter team has been diligently working to provide full support for Material 3, enabling developers to migrate their existing Material 2 apps seamlessly. By staying up to date with the progress of Material 3 implementation in Flutter, developers can leverage its features to enhance their app development process and deliver exceptional user experiences.

    Remember, Material 3 is an exciting opportunity for Flutter developers to create consistent and unified UI experiences, and exploring its key changes opens up new possibilities for app design.