Streams in Java 8 with Examples

In Java 8, the Streams concept was introduced to process objects of collections efficiently. It provides a streamlined way to perform operations on collections such as filtering, mapping, and aggregating data.

Differences between java.util.stream and java.io streams

The streams in the java.util.stream package are designed for processing objects from collections, representing a stream of objects. On the other hand, java.io streams are used for handling binary and character data in files, representing streams of bytes or characters. The two therefore serve entirely different purposes, despite sharing the name "stream".

Difference between Collection and Stream

A Collection is used to represent a group of individual objects as a single entity. On the other hand, a Stream is used to process a group of objects from a collection sequentially.

To convert a Collection into a Stream, you can use the stream() method introduced in Java 8:

Stream<T> stream = collection.stream();

Once you have a Stream, you can process its elements in two phases:

  1. Configuration: Configuring the Stream pipeline using operations like filtering and mapping.

Filtering: Use the filter() method to filter elements based on a boolean condition:

Stream<T> filteredStream = stream.filter(element -> elementCondition);

Mapping: Use the map() method to transform elements into another form:

Stream<R> mappedStream = stream.map(element -> mapFunction);

2. Processing: Performing terminal operations to produce a result or side-effect.

Collecting: Use the collect() method to collect Stream elements into a Collection:

List<T> collectedList = stream.collect(Collectors.toList());

Counting: Use the count() method to count the number of elements in the Stream:

long count = stream.count();

Sorting: Use the sorted() method to sort elements in the Stream:

List<T> sortedList = stream.sorted().collect(Collectors.toList());

Min and Max: Use min() and max() methods to find the minimum and maximum values:

Optional<T> min = stream.min(comparator);
Optional<T> max = stream.max(comparator);

Iteration: Use the forEach() method to iterate over each element in the Stream:

stream.forEach(element -> System.out.println(element));

Array Conversion: Use the toArray() method to convert Stream elements into an array:

String[] array = stream.toArray(String[]::new); // for a Stream<String>

Stream Creation: Use the Stream.of() method to create a Stream from specific values or arrays:

Stream<Integer> intStream = Stream.of(1, 2, 3, 4, 5);

Java Stream API Example

These examples demonstrate the basic operations and benefits of using Streams in Java 8 for efficient data processing.

Example with Filtering and Mapping

Consider filtering even numbers from a list using Streams:

List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
List<Integer> evenNumbers = numbers.stream()
                                   .filter(num -> num % 2 == 0)
                                   .collect(Collectors.toList());
System.out.println("Even numbers: " + evenNumbers);

Example with Mapping

Transforming strings to uppercase using Streams:

List<String> names = Arrays.asList("John", "Jane", "Doe", "Alice");
List<String> upperCaseNames = names.stream()
                                  .map(name -> name.toUpperCase())
                                  .collect(Collectors.toList());
System.out.println("Upper case names: " + upperCaseNames);

These examples illustrate how Streams facilitate concise and efficient data processing in Java 8.

Additional Examples

Example with collect() Method

Collecting only even numbers from a list without Streams:

import java.util.*;

public class Test {
    public static void main(String[] args) {
        ArrayList<Integer> list = new ArrayList<>();
        for (int i = 0; i <= 10; i++) {
            list.add(i);
        }
        System.out.println("Original list: " + list);
        
        ArrayList<Integer> evenNumbers = new ArrayList<>();
        for (Integer num : list) {
            if (num % 2 == 0) {
                evenNumbers.add(num);
            }
        }
        System.out.println("Even numbers without Streams: " + evenNumbers);
    }
}

Collecting even numbers using Streams:

import java.util.*;
import java.util.stream.*;

public class Test {
    public static void main(String[] args) {
        ArrayList<Integer> list = new ArrayList<>();
        for (int i = 0; i <= 10; i++) {
            list.add(i);
        }
        System.out.println("Original list: " + list);
        
        List<Integer> evenNumbers = list.stream()
                                       .filter(num -> num % 2 == 0)
                                       .collect(Collectors.toList());
        System.out.println("Even numbers with Streams: " + evenNumbers);
    }
}

These updated examples showcase both traditional and streamlined approaches to handling collections in Java, emphasizing the efficiency and readability benefits of Java 8 Streams.

Streams in Java 8: Conclusion

The Java Stream API in Java 8 offers a powerful way to process collections with functional programming techniques. By simplifying complex operations like filtering and mapping, Streams enhance code clarity and efficiency. Embracing Streams empowers developers to write cleaner, more expressive Java code, making it a valuable tool for modern application development.


Method Reference in Java 8

Method Reference in Java 8 allows a functional interface method to be mapped to a specific method using the :: (double colon) operator. This technique simplifies the implementation of functional interfaces by directly referencing existing methods. The referenced method can be either a static method or an instance method. It’s important that the functional interface method and the specified method have matching argument types, while other elements such as return type, method name, and modifiers can differ.

If the specified method is a static method, the syntax is:

ClassName::methodName

If the method is an instance method, the syntax is:

ObjectReference::methodName

A functional interface can refer to a lambda expression and can also refer to a method reference. Therefore, a lambda expression can be replaced with a method reference, making method references an alternative syntax to lambda expressions.

Example with Lambda Expression

class Task {
    public static void main(String[] args) {
        Runnable r = () -> {
            for (int i = 0; i <= 10; i++) {
                System.out.println("Child Thread");
            }
        };
        Thread t = new Thread(r);
        t.start();

        for (int i = 0; i <= 10; i++) {
            System.out.println("Main Thread");
        }
    }
}

Example with Method Reference

class Task {
    public static void printChildThread() {
        for (int i = 0; i <= 10; i++) {
            System.out.println("Child Thread");
        }
    }

    public static void main(String[] args) {
        Runnable r = Task::printChildThread;
        Thread t = new Thread(r);
        t.start();

        for (int i = 0; i <= 10; i++) {
            System.out.println("Main Thread");
        }
    }
}

In the above example, the Runnable interface’s run() method is referring to the Task class’s static method printChildThread().

Method Reference to an Instance Method

interface Processor {
    void process(int i);
}

class Worker {
    public void display(int i) {
        System.out.println("From Method Reference: " + i);
    }

    public static void main(String[] args) {
        Processor p = i -> System.out.println("From Lambda Expression: " + i);
        p.process(10);

        Worker worker = new Worker();
        Processor p1 = worker::display;
        p1.process(20);
    }
}

In this example, the functional interface method process() is referring to the Worker class instance method display().

The main advantage of method references is that we can reuse existing code to implement functional interfaces, enhancing code reusability.
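
Method references also work with existing JDK methods, which is where the reuse benefit is most visible. The following short sketch (class and variable names are illustrative) uses String::toUpperCase and System.out::println in place of the kind of lambdas shown earlier in the Streams examples:

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

class MethodReferenceDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("John", "Jane", "Doe");

        // String::toUpperCase is an instance-method reference resolved against each stream element
        List<String> upperCaseNames = names.stream()
                                           .map(String::toUpperCase)
                                           .collect(Collectors.toList());

        // System.out::println is an instance-method reference on a particular object (System.out)
        upperCaseNames.forEach(System.out::println);
    }
}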

Constructor Reference in Java 8

We can use the :: (double colon) operator to refer to constructors as well.

Syntax:

ClassName::new

Example:

class Product {
    private String name;

    Product(String name) {
        this.name = name;
        System.out.println("Constructor Executed: " + name);
    }
}

interface Creator {
    Product create(String name);
}

class Factory {
    public static void main(String[] args) {
        Creator c = name -> new Product(name);
        c.create("From Lambda Expression");

        Creator c1 = Product::new;
        c1.create("From Constructor Reference");
    }
}

In this example, the functional interface Creator is referring to the Product class constructor.

Note: In method and constructor references, the argument types must match.


Functions in Java 8

Functions in Java 8 are similar to predicates, but they can return any type of result rather than only a boolean, producing a single value per invocation. Oracle introduced the Function interface in Java 8, housed within the java.util.function package, to facilitate the implementation of functions in Java applications. This interface contains a single abstract method, apply().

Difference Between Predicate and Function

Predicate in Java 8
  • Purpose: Used for conditional checks.
  • Type Parameters: Declares one type parameter, the input argument type (Predicate<T>).
  • Return Type: Returns a boolean value.
  • Methods: Defines the test() method and includes default methods like and(), or(), and negate().
Function in Java 8
  • Purpose: Performs an operation and returns a result.
  • Type Parameters: Declares two type parameters, the input argument type and the return type (Function<T, R>).
  • Return Type: Can return any type of value.
  • Methods: Defines the apply() method for computation.

Example: Finding the Square of a Number

Let’s write a function to calculate the square of a given integer:

import java.util.function.*;

class Test {
    public static void main(String[] args) {
        Function<Integer, Integer> square = x -> x * x;
        System.out.println("Square of 5: " + square.apply(5));  // Output: Square of 5: 25
        System.out.println("Square of -3: " + square.apply(-3)); // Output: Square of -3: 9
    }
}

BiFunction

BiFunction is another useful functional interface in Java 8. It represents a function that accepts two arguments and produces a result. This is particularly useful when you need to combine or process two input values.

Example: Concatenating two strings

import java.util.function.*;

public class BiFunctionExample {
    public static void main(String[] args) {
        BiFunction<String, String, String> concat = (a, b) -> a + b;
        System.out.println(concat.apply("Hello, ", "world!"));  // Output: Hello, world!
    }
}

Summary of Java 8 Functional Interfaces

  1. Predicate<T>:
    • Purpose: Conditional checks.
    • Method: boolean test(T t)
    • Example: Checking if a number is positive.
  2. Function<T, R>:
    • Purpose: Transforming data.
    • Method: R apply(T t)
    • Example: Converting a string to its length.
  3. BiFunction<T, U, R>:
    • Purpose: Operations involving two inputs.
    • Method: R apply(T t, U u)
    • Example: Adding two integers.
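
The three examples named in the list above can be shown together in one short sketch (variable names are illustrative):

import java.util.function.BiFunction;
import java.util.function.Function;
import java.util.function.Predicate;

class FunctionalInterfaceSummary {
    public static void main(String[] args) {
        // Predicate: checking if a number is positive
        Predicate<Integer> isPositive = n -> n > 0;
        System.out.println(isPositive.test(7));        // true

        // Function: converting a string to its length
        Function<String, Integer> length = String::length;
        System.out.println(length.apply("Java"));      // 4

        // BiFunction: adding two integers
        BiFunction<Integer, Integer, Integer> sum = (a, b) -> a + b;
        System.out.println(sum.apply(2, 3));           // 5
    }
}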

Conclusion

Java 8 functional interfaces like Predicate, Function, and BiFunction offer powerful tools for developers to write more concise and readable code. Understanding the differences and appropriate use cases for each interface allows for better application design and implementation.

By using these interfaces, you can leverage the power of lambda expressions to create cleaner, more maintainable code. Whether you are performing simple conditional checks or more complex transformations, Java 8 has you covered.


Predicate in Java 8 with Examples

Predicate in Java 8: A predicate is a function that takes a single argument and returns a boolean value. In Java, the Predicate interface was introduced in version 1.8 specifically for this purpose, as part of the java.util.function package. This interface serves as a functional interface, designed with a single abstract method: test().

Predicate Interface

The Predicate interface is defined as follows:

@FunctionalInterface
public interface Predicate<T> {
    boolean test(T t);
}

This interface allows the use of lambda expressions, making it highly suitable for functional programming practices.

Example 1: Checking if an Integer is Even

Let’s illustrate this with a simple example of checking whether an integer is even:

Traditional Approach:

public boolean test(Integer i) {
    return i % 2 == 0;
}

Lambda Expression:

Predicate<Integer> isEven = i -> i % 2 == 0;
System.out.println(isEven.test(4)); // Output: true
System.out.println(isEven.test(7)); // Output: false

Complete Predicate Program Example:

import java.util.function.Predicate;

public class TestPredicate {
    public static void main(String[] args) {
        Predicate<Integer> isEven = i -> i % 2 == 0;
        System.out.println(isEven.test(4));  // Output: true
        System.out.println(isEven.test(7));  // Output: false
        // System.out.println(isEven.test(true)); // Compile-time error
    }
}

More Predicate Examples

Example 2: Checking String Length

Here’s how you can determine if the length of a string exceeds a specified length:

Predicate<String> isLengthGreaterThanFive = s -> s.length() > 5;
System.out.println(isLengthGreaterThanFive.test("Generate")); // Output: true
System.out.println(isLengthGreaterThanFive.test("Java"));     // Output: false

Example 3: Checking Collection Emptiness

You can also check if a collection is not empty using a predicate:

import java.util.Collection;
import java.util.function.Predicate;

Predicate<Collection<?>> isNotEmpty = c -> !c.isEmpty();

Combining Predicates

Predicates can be combined using logical operations such as and(), or(), and negate(). This allows for building more complex conditions.

Example 4: Combining Predicates

Here’s an example demonstrating how to combine predicates:

import java.util.function.Predicate;

public class CombinePredicates {
    public static void main(String[] args) {
        int[] numbers = {0, 5, 10, 15, 20, 25, 30};

        Predicate<Integer> isGreaterThan10 = i -> i > 10;
        Predicate<Integer> isOdd = i -> i % 2 != 0;

        System.out.println("Numbers greater than 10:");
        filterNumbers(isGreaterThan10, numbers);

        System.out.println("Odd numbers:");
        filterNumbers(isOdd, numbers);

        System.out.println("Numbers not greater than 10:");
        filterNumbers(isGreaterThan10.negate(), numbers);

        System.out.println("Numbers greater than 10 and odd:");
        filterNumbers(isGreaterThan10.and(isOdd), numbers);

        System.out.println("Numbers greater than 10 or odd:");
        filterNumbers(isGreaterThan10.or(isOdd), numbers);
    }

    public static void filterNumbers(Predicate<Integer> predicate, int[] numbers) {
        for (int number : numbers) {
            if (predicate.test(number)) {
                System.out.println(number);
            }
        }
    }
}

Predicate in Java 8: Using and(), or(), and negate() Methods

In Java programming, the Predicate interface from the java.util.function package offers convenient methods to combine and modify predicates, allowing developers to create more sophisticated conditions.

Example 1: Combining Predicates with and()

The and() method enables the combination of two predicates. It creates a new predicate that evaluates to true only if both original predicates return true.

import java.util.function.Predicate;

public class CombinePredicatesExample {
    public static void main(String[] args) {
        Predicate<Integer> isGreaterThan10 = i -> i > 10;
        Predicate<Integer> isEven = i -> i % 2 == 0;

        // Combined predicate: numbers greater than 10 and even
        Predicate<Integer> isGreaterThan10AndEven = isGreaterThan10.and(isEven);

        // Testing the combined predicate
        System.out.println("Combined Predicate Test:");
        System.out.println(isGreaterThan10AndEven.test(12)); // Output: true
        System.out.println(isGreaterThan10AndEven.test(7));  // Output: false
        System.out.println(isGreaterThan10AndEven.test(9));  // Output: false
    }
}

Example 2: Combining Predicates with or()

The or() method allows predicates to be combined so that the resulting predicate returns true if at least one of the original predicates evaluates to true.

import java.util.function.Predicate;

public class CombinePredicatesExample {
    public static void main(String[] args) {
        Predicate<Integer> isEven = i -> i % 2 == 0;
        Predicate<Integer> isDivisibleBy3 = i -> i % 3 == 0;

        // Combined predicate: numbers that are either even or divisible by 3
        Predicate<Integer> isEvenOrDivisibleBy3 = isEven.or(isDivisibleBy3);

        // Testing the combined predicate
        System.out.println("Combined Predicate Test:");
        System.out.println(isEvenOrDivisibleBy3.test(6));  // Output: true
        System.out.println(isEvenOrDivisibleBy3.test(9));  // Output: true
        System.out.println(isEvenOrDivisibleBy3.test(7));  // Output: false
    }
}

Example 3: Negating a Predicate with negate()

The negate() method returns a predicate that represents the logical negation (opposite) of the original predicate.

import java.util.function.Predicate;

public class NegatePredicateExample {
    public static void main(String[] args) {
        Predicate<Integer> isEven = i -> i % 2 == 0;

        // Negated predicate: numbers that are not even
        Predicate<Integer> isNotEven = isEven.negate();

        // Testing the negated predicate
        System.out.println("Negated Predicate Test:");
        System.out.println(isNotEven.test(3));  // Output: true
        System.out.println(isNotEven.test(6));  // Output: false
    }
}

and() Method: Combines two predicates so that both conditions must be true for the combined predicate to return true.

or() Method: Creates a predicate that returns true if either of the two predicates is true.

negate() Method: Returns a predicate that represents the logical negation (inverse) of the original predicate.

Best Practices for Using Predicate in Java 8

  1. Descriptive Names: Use descriptive variable names for predicates to enhance code readability (e.g., isEven, isLengthGreaterThanFive).
  2. Conciseness: Keep lambda expressions concise and avoid complex logic within them.
  3. Combination: Utilize and(), or(), and negate() methods to compose predicates for more refined conditions.
  4. Stream Operations: Predicates are commonly used in stream operations for filtering elements based on conditions, as shown in the sketch after this list.
  5. Null Handling: Consider null checks if predicates may encounter null values.
  6. Documentation: Document predicates, especially those with complex logic, to aid understanding for others and future reference.
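
As a hedged illustration of points 4 and 5 above (the list of names is made up for the example), a predicate can be combined with a null check and used directly in a stream filter:

import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.function.Predicate;
import java.util.stream.Collectors;

class PredicateStreamExample {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("John", null, "Jane", "Al", "Alice");

        // Null-safe predicate: the length check only runs for non-null elements
        Predicate<String> nonNull = Objects::nonNull;
        Predicate<String> isLongEnough = nonNull.and(s -> s.length() > 3);

        List<String> filtered = names.stream()
                                     .filter(isLongEnough)
                                     .collect(Collectors.toList());
        System.out.println(filtered); // [John, Jane, Alice]
    }
}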

Conclusion

Predicates in Java provide a powerful mechanism for testing conditions on objects, offering flexibility and efficiency in code design. By leveraging lambda expressions and method references, developers can write cleaner and more expressive code. Start incorporating predicates into your Java projects to streamline logic and improve maintainability.


Default Methods in Interfaces in Java 8 with Examples

Until Java 1.7, inside an interface we could define only public abstract methods and public static final variables. Every method present inside an interface is always public and abstract, whether we declare it so or not. Similarly, every variable declared inside an interface is always public, static, and final, whether we declare it so or not.

From Java 1.8 onwards, in addition to these, we can declare default concrete methods inside interfaces, also known as defender methods. This makes it possible to include method implementations within interfaces, providing more flexibility and enabling new design patterns.

We can declare a default method using the keyword default as follows:

default void m1() {
    System.out.println("Default Method");
}

Interface default methods are by default available to all implementation classes. Based on the requirement, an implementation class can use these default methods directly or override them.

Default Methods in Interfaces Example:

interface ExampleInterface {
    default void m1() {
        System.out.println("Default Method");
    }
}

class ExampleClass implements ExampleInterface {
    public static void main(String[] args) {
        ExampleClass example = new ExampleClass();
        example.m1();
    }
}

Default methods are also known as defender methods or virtual extension methods. The main advantage of default methods is that we can add new functionality to the interface without affecting the implementation classes (backward compatibility).
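
Because implementation classes are free to override a default method, the earlier example can be extended as follows (a minimal sketch reusing the ExampleInterface defined above; the class name is illustrative):

class OverridingExampleClass implements ExampleInterface {
    @Override
    public void m1() {
        System.out.println("Overridden Default Method");
    }

    public static void main(String[] args) {
        new OverridingExampleClass().m1(); // prints the overridden implementation
    }
}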

Note: We can’t override Object class methods as default methods inside an interface; otherwise, we get a compile-time error.

Example:

interface InvalidInterface {
    default int hashCode() {
        return 10;
    }
}

Compile-Time Error: The reason is that Object class methods are by default available to every Java class, so it’s not required to bring them through default methods.

Default Method vs Multiple Inheritance

Two interfaces can contain default methods with the same signature, which may cause an ambiguity problem (diamond problem) in the implementation class. To overcome this problem, we must override the default method in the implementation class; otherwise, we get a compile-time error.

Example 1:

interface Left {
    default void m1() {
        System.out.println("Left Default Method");
    }
}

interface Right {
    default void m1() {
        System.out.println("Right Default Method");
    }
}

class CombinedClass implements Left, Right {
    public void m1() {
        System.out.println("Combined Class Method");
    }

    public static void main(String[] args) {
        CombinedClass combined = new CombinedClass();
        combined.m1();
    }
}

Example 2:

class CombinedClass implements Left, Right {
    public void m1() {
        Left.super.m1();
    }

    public static void main(String[] args) {
        CombinedClass combined = new CombinedClass();
        combined.m1();
    }
}

Differences between Interface with Default Methods and Abstract Class

Even though we can add concrete methods in the form of default methods to the interface, it won’t be equal to an abstract class.

Interface with Default Methods vs. Abstract Class:

  • Variables: In an interface, every variable is always public static final; an abstract class may contain instance variables required by child classes.
  • State: An interface does not talk about the state of the object; an abstract class can.
  • Constructors: An interface cannot declare constructors; an abstract class can.
  • Instance and static blocks: An interface cannot declare them; an abstract class can.
  • Lambda expressions: A functional interface with default methods can refer to lambda expressions; an abstract class cannot.
  • Object class methods: An interface cannot override Object class methods; an abstract class can.

Static Methods in Java 8 Inside Interface

From Java 1.8 onwards, we can write static methods inside an interface to define utility functions. Interface static methods are by default not available to the implementation classes. Therefore, we cannot call interface static methods using an implementation class reference. We should call interface static methods using the interface name.

interface UtilityInterface {
    public static void sum(int a, int b) {
        System.out.println("The Sum: " + (a + b));
    }
}

class UtilityClass implements UtilityInterface {
    public static void main(String[] args) {
        UtilityInterface.sum(10, 20);
    }
}

As interface static methods are not available to the implementation class, the concept of overriding is not applicable. We can define exactly the same method in the implementation class, but it’s not considered overriding.

Example 1:

interface StaticMethodInterface {
    public static void m1() {}
}

class StaticMethodClass implements StaticMethodInterface {
    public static void m1() {}
}

Example 2:

interface StaticMethodInterface {
    public static void m1() {}
}

class StaticMethodClass implements StaticMethodInterface {
    public void m1() {}
}

This is valid but not considered overriding.

Example 3:

class ParentClass {
    private void m1() {}
}

class ChildClass extends ParentClass {
    public void m1() {}
}

This is valid but not considered overriding.

From Java 1.8 onwards, we can write the main() method inside an interface, and hence we can run the interface directly from the command prompt.

Example:

interface MainMethodInterface {
    public static void main(String[] args) {
        System.out.println("Interface Main Method");
    }
}

At the command prompt:

javac MainMethodInterface.java
java MainMethodInterface

Differences between Interface with Default Methods and Abstract Class

In conclusion, while interfaces with default methods offer some of the functionalities of abstract classes, there are still distinct differences between the two, particularly in terms of handling state, constructors, and method overriding capabilities.

Static Methods Inside Interface

It is important to note that interface static methods cannot be overridden. Here is another example illustrating this concept:

Example:

interface CalculationInterface {
    public static void calculate(int a, int b) {
        System.out.println("Calculation: " + (a + b));
    }
}

class CalculationClass implements CalculationInterface {
    public static void calculate(int a, int b) {
        System.out.println("Calculation (class): " + (a * b));
    }

    public static void main(String[] args) {
        CalculationInterface.calculate(10, 20);  // Calls the interface static method
        CalculationClass.calculate(10, 20);      // Calls the class static method
    }
}

In this example, CalculationInterface.calculate() and CalculationClass.calculate() are two separate methods, and neither overrides the other.

Main Method in Interface

From Java 1.8 onwards, we can write a main() method inside an interface and run the interface directly from the command prompt. This feature can be useful for testing purposes.


Example:

interface ExecutableInterface {
    public static void main(String[] args) {
        System.out.println("Interface Main Method");
    }
}

To compile and run the above code from the command prompt:

javac ExecutableInterface.java
java ExecutableInterface

Additional Points to Consider

  1. Multiple Inheritance in Interfaces:
    • Interfaces in Java support multiple inheritance, which means a class can implement multiple interfaces. This is particularly useful when you want to design a class that conforms to multiple contracts.
  2. Resolution of Default Methods:
    • If a class implements multiple interfaces with conflicting default methods, the compiler will throw an error, and the class must provide an implementation for the conflicting methods to resolve the ambiguity.

Example:

interface FirstInterface {
    default void show() {
        System.out.println("FirstInterface Default Method");
    }
}

interface SecondInterface {
    default void show() {
        System.out.println("SecondInterface Default Method");
    }
}

class ConflictResolutionClass implements FirstInterface, SecondInterface {
    @Override
    public void show() {
        System.out.println("Resolved Method");
    }

    public static void main(String[] args) {
        ConflictResolutionClass obj = new ConflictResolutionClass();
        obj.show();  // Calls the resolved method
    }
}

3. Functional Interfaces with Default Methods:

  • A functional interface is an interface with a single abstract method, but it can still have multiple default methods. This combination allows you to provide a default behavior while still adhering to the functional programming paradigm.

@FunctionalInterface
interface FunctionalExample {
    void singleAbstractMethod();

    default void defaultMethod1() {
        System.out.println("Default Method 1");
    }

    default void defaultMethod2() {
        System.out.println("Default Method 2");
    }
}

class FunctionalExampleClass implements FunctionalExample {
    @Override
    public void singleAbstractMethod() {
        System.out.println("Implemented Abstract Method");
    }

    public static void main(String[] args) {
        FunctionalExampleClass obj = new FunctionalExampleClass();
        obj.singleAbstractMethod();
        obj.defaultMethod1();
        obj.defaultMethod2();
    }
}

Summary

Java 8 introduced significant enhancements to interfaces, primarily through the addition of default and static methods. These changes allow for more flexible and backward-compatible API design. Here are the key points:

  • Default Methods: Provide concrete implementations in interfaces without affecting existing implementing classes.
  • Static Methods: Allow utility methods to be defined within interfaces.
  • Main Method in Interfaces: Enables testing and execution of interfaces directly.
  • Conflict Resolution: Requires explicit resolution of conflicting default methods from multiple interfaces.
  • Functional Interfaces: Can have default methods alongside a single abstract method, enhancing their utility in functional programming.

These features make Java interfaces more powerful and versatile, facilitating more robust and maintainable code design.


Related Articles:

Java 21 Features

  1. Java 21 Features With Examples
  2. Java 21 Pattern Matching for Switch Example
  3. Java 21 Unnamed Patterns and Variables with Examples
  4. Java 21 Unnamed Classes and Instance Main Methods
  5. Java String Templates in Java 21: Practical Examples
  6. Sequenced Collections in Java 21

Record Classes in Java 17


How to Create a Custom Starter with Spring Boot 3

Today, we’ll explore how to create a custom starter with Spring Boot 3. This custom starter simplifies the setup and configuration process across different projects. By developing a custom starter, you can package common configurations and dependencies, ensuring they’re easily reusable in various Spring Boot applications. We’ll guide you through each step of creating and integrating this starter, harnessing the robust auto-configuration and dependency management features of Spring Boot.

Benefits of Creating Custom Starters:

  1. Modularity and Reusability:
    • Custom starters encapsulate reusable configuration, dependencies, and setup logic into a single module. This promotes modularity by isolating specific functionalities, making it easier to reuse across different projects.
  2. Consistency and Standardization:
    • By defining a custom starter, developers can enforce standardized practices and configurations across their applications. This ensures consistency in how components like databases, messaging systems, or integrations are configured and used.
  3. Reduced Boilerplate Code:
    • Custom starters eliminate repetitive setup tasks and boilerplate code. Developers can quickly bootstrap new projects by simply including the starter dependency, rather than manually configuring each component from scratch.
  4. Simplified Maintenance:
    • Centralizing configuration and dependencies in a custom starter simplifies maintenance. Updates or changes to common functionalities can be made in one place, benefiting all projects that use the starter.
  5. Developer Productivity:
    • Developers spend less time on initial setup and configuration, focusing more on implementing business logic and features. This accelerates development cycles and enhances productivity.

Step 1: Setting Up the Custom Messaging Starter

Let’s start by setting up a new Maven project named custom-messaging-starter.

  1. Setting Up the Maven Project: Create a new directory structure for your Maven project, following the standard Maven layout (src/main/java and src/main/resources).

2. Define Dependencies and Configuration

Update the pom.xml file with the necessary dependencies, such as spring-boot-starter-web, spring-boot-starter-amqp, and spring-boot-autoconfigure. The AMQP starter brings in what is needed to manage RabbitMQ connections seamlessly.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>3.3.1</version>
		<relativePath/> <!-- lookup parent from repository -->
	</parent>
	<groupId>com.javadzone</groupId>
	<artifactId>custom-messaging-starter</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<name>custom-messaging-starter</name>
	<description>Creating custom starter using spring boot</description>

	<properties>
		<java.version>21</java.version>
	</properties>
	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-amqp</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-autoconfigure</artifactId>
		</dependency>

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<groupId>org.springframework.boot</groupId>
				<artifactId>spring-boot-maven-plugin</artifactId>
			</plugin>
		</plugins>
	</build>

</project>

3. Creating Auto-Configuration

Develop the CustomMessagingAutoConfiguration class to configure RabbitMQ connections. In Spring Boot 3, auto-configuration classes are annotated with @AutoConfiguration rather than plain @Configuration. This ensures that messaging between microservices is streamlined without manual setup.

import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.boot.autoconfigure.AutoConfiguration;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;

@AutoConfiguration
public class CustomMessagingAutoConfiguration {

    // Binds app.rabbitmq.* properties (host, port, username, password) to the connection factory,
    // and backs off if the application already defines its own ConnectionFactory bean.
    @Bean
    @ConditionalOnMissingBean(ConnectionFactory.class)
    @ConfigurationProperties(prefix = "app.rabbitmq")
    public CachingConnectionFactory connectionFactory() {
        return new CachingConnectionFactory();
    }

    // Provides a RabbitTemplate only when the application has not declared one itself.
    @Bean
    @ConditionalOnMissingBean(RabbitTemplate.class)
    public RabbitTemplate rabbitTemplate(ConnectionFactory connectionFactory) {
        return new RabbitTemplate(connectionFactory);
    }
}

4. Registering Auto-Configuration

In Spring Boot 3, auto-configuration classes are no longer registered through spring.factories (that mechanism was removed in Spring Boot 3.0). Instead, create a META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports file within src/main/resources and list your auto-configuration class in it:

com.example.messaging.CustomMessagingAutoConfiguration

5. Building and Installing

Build and install your custom starter into the local Maven repository using the following command:

mvn clean install

Step 2: Using the Custom Messaging Starter in a Spring Boot Application

Now, let’s see how you can utilize this custom messaging starter (custom-messaging-starter) in a Spring Boot application (my-messaging-app).

  1. Adding Dependency: Include the custom messaging starter dependency in the pom.xml file of your Spring Boot application:
<dependencies>
    <dependency>
        <groupId>com.javadzone</groupId>
        <artifactId>custom-messaging-starter</artifactId>
        <version>0.0.1-SNAPSHOT</version>
    </dependency>
    <!-- Other dependencies -->
</dependencies>

2. Configuring RabbitMQ

Configure RabbitMQ connection properties in src/main/resources/application.properties:

app.rabbitmq.host=localhost
app.rabbitmq.port=5672
app.rabbitmq.username=guest
app.rabbitmq.password=guest

3. Using RabbitTemplate

Implement a simple application to send and receive messages using RabbitMQ:

@SpringBootApplication
public class CustomMessagingStarterApplication {

    private final RabbitTemplate rabbitTemplate;

    public CustomMessagingStarterApplication(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    public static void main(String[] args) {
        SpringApplication.run(CustomMessagingStarterApplication.class, args);
    }

    @Bean
    public CommandLineRunner sendMessage() {
        return args -> {
            rabbitTemplate.convertAndSend("myQueue", "Hello, RabbitMQ!");
            System.out.println("Message sent to the queue.");
        };
    }
}

4. Setting Up a Listener

To demonstrate message receiving, add a listener component:

@Component
public class MessageListener {

    @RabbitListener(queues = "myQueue")
    public void receiveMessage(String message) {
        System.out.println("Received message: " + message);
    }
}
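
For the send and receive examples above to work, the myQueue queue must exist on the broker. If it is not already declared, you can let the application declare it at startup; a minimal sketch (the configuration class name is illustrative):

import org.springframework.amqp.core.Queue;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class QueueConfig {

    // Queue bean that Spring AMQP's auto-configured RabbitAdmin declares on the broker at startup
    @Bean
    public Queue myQueue() {
        return new Queue("myQueue", true); // durable queue matching the sender and listener
    }
}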

When to Create a Custom Starter with Spring Boot 3 in Real-Time Applications:

  1. Complex Configuration Requirements:
    • When an application requires complex or specialized configurations that are consistent across multiple projects (e.g., database settings, messaging queues), a custom starter can abstract these configurations for easy integration.
  2. Cross-Project Consistency:
    • Organizations with multiple projects or microservices can use custom starters to enforce consistent practices and configurations, ensuring uniformity in how applications are developed and maintained.
  3. Encapsulation of Best Practices:
    • If your organization has established best practices or patterns for specific functionalities (e.g., logging, security, caching), encapsulating these practices in a custom starter ensures they are applied uniformly across applications.
  4. Third-Party Integrations:
    • Custom starters are beneficial when integrating with third-party services or APIs. They can encapsulate authentication methods, error handling strategies, and other integration specifics, simplifying the integration process for developers.
  5. Team Collaboration and Knowledge Sharing:
    • Creating custom starters promotes collaboration among teams by standardizing development practices. It also serves as a knowledge-sharing tool, allowing teams to document and share common configurations and setups.

Conclusion

By creating a custom Spring Boot starter for messaging with RabbitMQ, you streamline configuration management across projects. This encapsulation ensures consistency in messaging setups, reduces redundancy, and simplifies maintenance efforts. Custom starters are powerful tools for enhancing developer productivity and ensuring standardized practices in enterprise applications.

Related Articles:

  1. What is Spring Boot and Its Features
  2. Spring Boot Starter
  3. Spring Boot Packaging
  4. Spring Boot Custom Banner
  5. 5 Ways to Run Spring Boot Application
  6. @ConfigurationProperties Example: 5 Proven Steps to Optimize
  7. Mastering Spring Boot Events: 5 Best Practices
  8. Spring Boot Profiles Mastery: 5 Proven Tips
  9. CommandLineRunners vs ApplicationRunners
  10. Spring Boot Actuator: 5 Performance Boost Tips
  11. Spring Boot API Gateway Tutorial
  12. Apache Kafka Tutorial
  13. Spring Boot MongoDB CRUD Application Example
  14. ChatGPT Integration with Spring Boot
  15. RestClient in Spring 6.1 with Examples
  16. Spring Boot Annotations Best Practices

Top 50 Spring Boot Interview Questions and Answers


Spring Boot is a popular framework for building Java applications quickly and efficiently. Whether you’re just starting or have been working with it for a while, you might have some questions. This blog post covers the top 50 Spring Boot Interview questions and answers to help you understand Spring Boot better.

Top 50 Spring Boot Questions and Answers

1. What is Spring Boot, and why should I use it?

Spring Boot is a framework built on top of the Spring Framework. It simplifies the setup and development of new Spring applications by providing default configurations and embedded servers, reducing the need for boilerplate code.

2. How do I create a Spring Boot application?

You can create a Spring Boot application using Spring Initializr (start.spring.io), an IDE like IntelliJ IDEA, or by using Spring Boot CLI:

  1. Go to Spring Initializr.
  2. Select your project settings (e.g., Maven, Java, Spring Boot version).
  3. Add necessary dependencies.
  4. Generate the project and unzip it.
  5. Open the project in your IDE and start coding.

3. What is the main class in a Spring Boot application?

The main class in a Spring Boot application is the entry point and is annotated with @SpringBootApplication. It includes the main method which launches the application using SpringApplication.run().

@SpringBootApplication
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}

4. What does the @SpringBootApplication annotation do?

@SpringBootApplication is a convenience annotation that combines three annotations: @Configuration (marks the class as a source of bean definitions), @EnableAutoConfiguration (enables Spring Boot’s auto-configuration mechanism), and @ComponentScan (scans the package of the annotated class for Spring components).

5. How can you configure properties in a Spring Boot application?

You can configure properties in a Spring Boot application using application.properties or application.yml files located in the src/main/resources directory.

# application.properties
server.port=8081
spring.datasource.url=jdbc:mysql://localhost:3306/mydb

6. How do you handle exceptions in Spring Boot?

You can handle exceptions in Spring Boot using @ControllerAdvice and @ExceptionHandler annotations to create a global exception handler.

@ControllerAdvice
public class GlobalExceptionHandler {
    @ExceptionHandler(ResourceNotFoundException.class)
    public ResponseEntity<ErrorResponse> handleResourceNotFoundException(ResourceNotFoundException ex) {
        ErrorResponse errorResponse = new ErrorResponse("NOT_FOUND", ex.getMessage());
        return new ResponseEntity<>(errorResponse, HttpStatus.NOT_FOUND);
    }
}

7. What is Spring Boot Actuator and what are its benefits?

Spring Boot Actuator provides production-ready features such as health checks, metrics, and monitoring for your Spring Boot application. It includes various endpoints that give insights into the application’s health and environment.

8. How can you enable and use Actuator endpoints in a Spring Boot application?

Add the Actuator dependency in your pom.xml or build.gradle file:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>

Configure the endpoints in application.properties:

management.endpoints.web.exposure.include=health,info

9. What are Spring Profiles and how do you use them?

Spring Profiles allow you to segregate parts of your application configuration and make it only available in certain environments. You can activate profiles using the spring.profiles.active property.

# application-dev.properties
spring.datasource.url=jdbc:mysql://localhost:3306/devdb
# application-prod.properties
spring.datasource.url=jdbc:mysql://localhost:3306/proddb

10. How do you test a Spring Boot application?

Spring Boot supports testing with various tools and annotations like @SpringBootTest, @WebMvcTest, and @DataJpaTest. Use MockMvc to test MVC controllers without starting a full HTTP server.

@SpringBootTest
public class MyApplicationTests {
    @Test
    void contextLoads() {
    }
}

11. How can you secure a Spring Boot application?

You can secure a Spring Boot application using Spring Security. Add the dependency and configure security settings:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-security</artifactId>
</dependency>
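
With the starter on the classpath, all endpoints are secured by default. A minimal sketch of a custom configuration in recent Spring Security versions might look like this (the rules shown are only an example, not a recommended policy):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
public class SecurityConfig {

    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/public/**").permitAll() // open endpoints
                .anyRequest().authenticated())             // everything else requires authentication
            .httpBasic(basic -> {});                       // enable HTTP Basic authentication
        return http.build();
    }
}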

12. What is a Spring Boot Starter and why is it useful?

Spring Boot Starters are a set of convenient dependency descriptors you can include in your application. They provide a one-stop-shop for all the dependencies you need for a particular feature.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>

13. How can you configure a DataSource in Spring Boot?

You can configure a DataSource by adding properties in the application.properties file:

spring.datasource.url=jdbc:mysql://localhost:3306/mydb
spring.datasource.username=root
spring.datasource.password=secret
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver

14. What is Spring Boot DevTools and how does it enhance development?

Spring Boot DevTools provides features to enhance the development experience, such as automatic restarts, live reload, and configurations for faster feedback loops. Add the dependency to your project:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <optional>true</optional>
</dependency>

15. How can you handle different environments in a Spring Boot application?

You can handle different environments using Spring Profiles. Define environment-specific properties files like application-dev.properties, application-prod.properties, and activate a profile using spring.profiles.active.

16. What are the differences between @Component, @Service, @Repository, and @Controller annotations?

These annotations are specializations of @Component:

  • @Component: Generic stereotype for any Spring-managed component.
  • @Service: Specialization for service layer classes.
  • @Repository: Specialization for persistence layer classes.
  • @Controller: Specialization for presentation layer (MVC controllers).

17. How can you create a RESTful web service using Spring Boot?

Use @RestController and @RequestMapping annotations to create REST endpoints.

@RestController
@RequestMapping("/api")
public class MyController {

    @GetMapping("/greeting")
    public String greeting() {
        return "Hello, World!";
    }
}

18. What is Spring Boot CLI and how is it used?

Spring Boot CLI is a command-line tool that allows you to quickly prototype with Spring. It supports Groovy scripts to write Spring applications.

$ spring init --dependencies=web my-app   # generate a new project
$ spring run MyApp.groovy                 # run a standalone Groovy script directly

19. How can you connect to a database using Spring Data JPA?

Add the necessary dependencies and create a repository interface extending JpaRepository.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>

public interface UserRepository extends JpaRepository<User, Long> {
}

20. How can you use the H2 Database for development and testing in Spring Boot?

Add the H2 dependency and configure the database settings in application.properties:

<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>

spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.h2.console.enabled=true

21. What is the purpose of @Autowired?

@Autowired is used to inject beans (dependencies) automatically by Spring’s dependency injection mechanism. It can be used on constructors, fields, or setter methods.
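
A minimal sketch of constructor injection (OrderService and OrderRepository are illustrative names); note that with a single constructor, recent Spring versions make the annotation optional:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class OrderService {

    // OrderRepository is assumed to be another Spring-managed bean
    private final OrderRepository orderRepository;

    @Autowired
    public OrderService(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }
}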

22. How can you customize the Spring Boot banner?

You can customize the Spring Boot startup banner by placing a banner.txt file in the src/main/resources directory. You can also disable it entirely using spring.main.banner-mode=off in the application.properties file.

23. How can you create a custom starter in Spring Boot?

To create a custom starter, you need to create a new project with the necessary dependencies and configuration, then package it as a JAR. Include this JAR as a dependency in your Spring Boot application.

24. How do you run a Spring Boot application as a standalone jar?

Spring Boot applications can be packaged as executable JAR files with an embedded server. You can run the JAR using the command java -jar myapp.jar.

25. What are the best practices for logging in Spring Boot?

Use SLF4J with Logback as the default logging framework. Configure logging levels in application.properties and use appropriate logging levels (DEBUG, INFO, WARN, ERROR) in your code.

logging.level.org.springframework=INFO
logging.level.com.example=DEBUG

26. How do you externalize configuration in Spring Boot?

Externalize configuration using application.properties or application.yml files, environment variables, or command-line arguments. This allows you to manage application settings without changing the code.
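
As a small hedged sketch (the app.greeting property name is made up for the illustration), the same value can come from application.properties, an environment variable such as APP_GREETING, or a --app.greeting=... command-line argument, with command-line arguments taking the highest precedence:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class GreetingProvider {

    // Resolved from any configured property source; "Hello" is the fallback default
    @Value("${app.greeting:Hello}")
    private String greeting;

    public String getGreeting() {
        return greeting;
    }
}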

27. How can you monitor Spring Boot applications?

Use Spring Boot Actuator to monitor applications. It provides endpoints for health checks, metrics, and more. Integrate with monitoring tools like Prometheus, Grafana, or ELK stack for enhanced monitoring.

28. How do you handle file uploads in Spring Boot?

Handle file uploads using MultipartFile in a controller method. Ensure you configure the spring.servlet.multipart properties in application.properties.

@PostMapping("/upload")
public String handleFileUpload(@RequestParam("file") MultipartFile file) {
    // handle the file
    return "File uploaded successfully!";
}

29. What is the purpose of @ConfigurationProperties?

@ConfigurationProperties is used to bind external configuration properties to a Java object. It’s useful for type-safe configuration.

@ConfigurationProperties(prefix = "app")
public class AppProperties {
    private String name;
    private String description;

    // getters and setters
}

30. How do you schedule tasks in Spring Boot?

Schedule tasks using @EnableScheduling and @Scheduled annotations. Define a method with the @Scheduled annotation to run tasks at specified intervals.

@Configuration
@EnableScheduling
public class SchedulingConfig {
}

@Component
public class ScheduledTasks {
    @Scheduled(fixedRate = 5000)
    public void reportCurrentTime() {
        System.out.println("Current time is " + new Date());
    }
}

31. How can you use Spring Boot with Kotlin?

Spring Boot supports Kotlin. Create a Spring Boot application using Kotlin by adding the necessary dependencies and configuring the project. Kotlin’s concise syntax can make the code more readable and maintainable.

32. What is Spring WebFlux?

Spring WebFlux is a reactive web framework in the Spring ecosystem, designed for building reactive and non-blocking web applications. It uses the Reactor project for its reactive support.

33. How do you enable CORS in Spring Boot?

Enable CORS (Cross-Origin Resource Sharing) using the @CrossOrigin annotation on controller methods or globally using a CorsConfiguration bean.

@RestController
@CrossOrigin(origins = "http://example.com")
public class MyController {
    @GetMapping("/greeting")
    public String greeting() {
        return "Hello, World!";
    }
}

34. How do you use Redis with Spring Boot?

Use Redis with Spring Boot by adding the spring-boot-starter-data-redis dependency and configuring Redis properties in application.properties. Note that Spring Boot 3 renamed the property prefix from spring.redis to spring.data.redis.

spring.data.redis.host=localhost
spring.data.redis.port=6379

35. What is Spring Cloud and how is it related to Spring Boot?

Spring Cloud provides tools for building microservices and distributed systems on top of Spring Boot. It offers features like configuration management, service discovery, and circuit breakers.

36. How do you implement caching in Spring Boot?

Implement caching using the @EnableCaching annotation and a caching library like EhCache, Hazelcast, or Redis. Annotate methods with @Cacheable, @CachePut, and @CacheEvict for caching behavior.

@Configuration
@EnableCaching
public class CacheConfig {
}

@Service
public class UserService {
    @Cacheable("users")
    public User getUserById(Long id) {
        return userRepository.findById(id).orElse(null);
    }
}

37. How can you send emails with Spring Boot?

Send emails using Spring Boot by adding the spring-boot-starter-mail dependency and configuring email properties in application.properties. Use JavaMailSender to send emails.

spring.mail.host=smtp.example.com
spring.mail.port=587
spring.mail.username=user@example.com
spring.mail.password=secret

@Service
public class EmailService {
    @Autowired
    private JavaMailSender mailSender;

    public void sendSimpleMessage(String to, String subject, String text) {
        SimpleMailMessage message = new SimpleMailMessage();
        message.setTo(to);
        message.setSubject(subject);
        message.setText(text);
        mailSender.send(message);
    }
}

38. What is @SpringBootTest?

@SpringBootTest is an annotation that loads the full application context for integration tests. It is used to write tests that require Spring Boot’s features, like dependency injection and embedded servers.

39. How do you integrate Spring Boot with a front-end framework like Angular or React?

Integrate Spring Boot with front-end frameworks by building the front-end project and placing the static files in the src/main/resources/static directory of your Spring Boot project. Configure Spring Boot to serve these files.

40. How do you configure Thymeleaf in Spring Boot?

Thymeleaf is a templating engine supported by Spring Boot. Add the spring-boot-starter-thymeleaf dependency and place your templates in the src/main/resources/templates directory.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>

41. What is the purpose of @SpringBootApplication?

@SpringBootApplication is a convenience annotation that combines @Configuration, @EnableAutoConfiguration, and @ComponentScan. It marks the main class of a Spring Boot application.

42. How do you use CommandLineRunner in Spring Boot?

CommandLineRunner is an interface used to execute code after the Spring Boot application starts. Implement the run method to perform actions on startup.

@Component
public class MyCommandLineRunner implements CommandLineRunner {
    @Override
    public void run(String... args) throws Exception {
        System.out.println("Hello, World!");
    }
}

43. How do you connect to an external REST API using Spring Boot?

Connect to an external REST API using RestTemplate or WebClient. RestTemplate is synchronous, while WebClient is asynchronous and non-blocking.

@RestController
@RequestMapping("/api")
public class ApiController {
    @Autowired
    private RestTemplate restTemplate;

    @GetMapping("/data")
    public String getData() {
        return restTemplate.getForObject("https://api.example.com/data", String.class);
    }
}

44. How do you implement pagination in Spring Boot?

Implement pagination using Spring Data JPA’s Pageable interface. Define repository methods that accept Pageable parameters.

public interface UserRepository extends JpaRepository<User, Long> {
    Page<User> findByLastName(String lastName, Pageable pageable);
}
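
A short usage sketch (the last name and page size are arbitrary):

import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;

// Request the first page (index 0) with 10 users per page
Pageable firstPage = PageRequest.of(0, 10);
Page<User> users = userRepository.findByLastName("Smith", firstPage);

System.out.println("Total pages: " + users.getTotalPages());
users.getContent().forEach(System.out::println);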

45. How do you document a Spring Boot REST API?

Document a Spring Boot REST API using Swagger/OpenAPI. For older Spring Boot versions, add the springfox-swagger2 and springfox-swagger-ui dependencies and configure Swagger; for Spring Boot 3, use the actively maintained springdoc-openapi library (springdoc-openapi-starter-webmvc-ui) instead, since Springfox does not support it.

<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.9.2</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.9.2</version>
</dependency>
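
A typical Springfox configuration class looks roughly like this (the base package is illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import springfox.documentation.builders.PathSelectors;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;
import springfox.documentation.swagger2.annotations.EnableSwagger2;

@Configuration
@EnableSwagger2
public class SwaggerConfig {

    @Bean
    public Docket api() {
        // Scans the given package and documents every exposed path
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.basePackage("com.example"))
                .paths(PathSelectors.any())
                .build();
    }
}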

46. How do you handle validation in Spring Boot?

Handle validation using the javax.validation package. Use annotations like @NotNull, @Size, and @Email in your model classes, and @Valid in your controller methods.

public class User {
    @NotNull
    private String name;
    @Email
    private String email;
}
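
A controller method sketch that triggers validation with @Valid (the endpoint path is illustrative); invalid payloads are rejected with a 400 Bad Request by default:

@RestController
@RequestMapping("/users")
public class UserController {

    @PostMapping
    public ResponseEntity<String> createUser(@Valid @RequestBody User user) {
        // Reached only when the User payload passes validation
        return ResponseEntity.ok("User is valid");
    }
}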

47. How do you set up Spring Boot with Docker?

Set up Spring Boot with Docker by creating a Dockerfile that specifies the base image and instructions to build and run the application.

FROM openjdk:11-jre-slim
COPY target/myapp.jar myapp.jar
ENTRYPOINT ["java", "-jar", "/myapp.jar"]

48. How do you deploy a Spring Boot application to AWS?

Deploy a Spring Boot application to AWS by using services like Elastic Beanstalk, ECS, or Lambda. Package your application as a JAR or Docker image and upload it to the chosen service.

49. What is the difference between Spring Boot and Spring MVC?

Spring Boot is a framework for quickly building Spring-based applications with minimal configuration. Spring MVC is a framework for building web applications using the Model-View-Controller design pattern. Spring Boot often uses Spring MVC as part of its web starter.

50. How do you migrate a legacy application to Spring Boot?

Migrate a legacy application to Spring Boot by incrementally introducing Spring Boot dependencies and configurations. Replace legacy configurations with Spring Boot’s auto-configuration and starters, and gradually refactor the application to use Spring Boot features.

Spring Boot Interview Questions: Conclusion

Spring Boot is widely liked by developers because it’s easy to use and powerful. Learning from these top 50 questions and answers helps you understand Spring Boot better. You can solve many problems like setting up applications, connecting to databases, adding security, and putting your app on the cloud. Spring Boot makes these tasks simpler, helping you build better applications faster. Keep learning and enjoy coding with Spring Boot!

Related Articles:

  1. What is Spring Boot and Its Features
  2. Spring Boot Starter
  3. Spring Boot Packaging
  4. Spring Boot Custom Banner
  5. 5 Ways to Run Spring Boot Application
  6. @ConfigurationProperties Example: 5 Proven Steps to Optimize
  7. Mastering Spring Boot Events: 5 Best Practices
  8. Spring Boot Profiles Mastery: 5 Proven Tips
  9. CommandLineRunners vs ApplicationRunners
  10. Spring Boot Actuator: 5 Performance Boost Tips
  11. Spring Boot API Gateway Tutorial
  12. Apache Kafka Tutorial
  13. Spring Boot MongoDB CRUD Application Example
  14. ChatGPT Integration with Spring Boot
  15. RestClient in Spring 6.1 with Examples
  16. Spring Boot Annotations Best Practices

Spring Boot Annotations Best Practices

Introduction

Annotations are a powerful feature of the Spring Framework, offering a declarative way to manage configuration and behavior in your applications. They simplify the code and make it more readable and maintainable. However, misuse or overuse of annotations can lead to confusing and hard-to-maintain code. In this blog post, we’ll explore Spring Boot Annotations Best Practices, along with examples to illustrate these practices.

Understanding Annotations

Annotations in Spring Boot are metadata that provide data about a program. They can be applied to classes, methods, fields, and other program elements. Common annotations include @RestController, @Service, @Repository, @Component, and @Autowired. Each of these has specific use cases and best practices to ensure your application remains clean and maintainable.

Best Practices for Common Spring Boot Annotations:

@RestController and @Controller

  • Use @RestController for RESTful web services: This annotation combines @Controller and @ResponseBody, simplifying the creation of RESTful APIs.
  • Best Practice: Separate your controller logic from business logic by delegating operations to service classes.

Example:

@RestController
@RequestMapping("/api")
public class MyController {

    private final MyService myService;

    @Autowired
    public MyController(MyService myService) {
        this.myService = myService;
    }

    @GetMapping("/hello")
    public String sayHello() {
        return myService.greet();
    }
}

@Service and @Component
  • Use @Service to denote service layer classes: This makes the purpose of the class clear and differentiates it from other components.
  • Best Practice: Use @Component for generic components that do not fit other stereotypes.

Example:

@Service
public class MyService {
    public String greet() {
        return "Hello, World!";
    }
}

@Repository
  • Use @Repository for Data Access Object (DAO) classes: This annotation marks the class as a DAO and enables exception translation.
  • Best Practice: Ensure your repository classes are only responsible for data access logic.

Example:

@Repository
public class MyRepository {
    // Data access methods
}

@Autowired
  • Prefer constructor injection over field injection: Constructor injection is better for testability and promotes immutability.
  • Best Practice: Use @RequiredArgsConstructor from Lombok to generate constructors automatically.

Example:

@Service
@RequiredArgsConstructor
public class MyService {

    private final MyRepository myRepository;

    public String process(String input) {
        // Business logic
        return "Processed " + input;
    }
}

@Configuration and @Bean
  • Use @Configuration to define configuration classes: These classes contain methods annotated with @Bean that produce Spring-managed beans.
  • Best Practice: Use explicit bean definitions over component scanning for better control and clarity.

Example:

@Configuration
public class AppConfig {

    @Bean
    public MyService myService() {
        return new MyService();
    }
}

@Value and @ConfigurationProperties
  • Use @Value for injecting simple properties: This annotation is useful for basic configuration values (a short @Value sketch follows the example below).
  • Use @ConfigurationProperties for structured configuration: This approach is cleaner for complex configuration data and supports validation.

Example:

@ConfigurationProperties(prefix = "app")
public class AppProperties {
    private String name;
    private int timeout;

    // Getters and setters
}
@SpringBootApplication
@EnableConfigurationProperties(AppProperties.class)
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}
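
As a complement to the bullets above, a minimal @Value sketch, assuming an app.name entry exists in application.properties (the class name is illustrative):

@Component
public class AppNamePrinter {

    @Value("${app.name}")
    private String appName; // injected from application.properties

    public void printName() {
        System.out.println("Application name: " + appName);
    }
}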

Custom Annotations

Creating custom annotations can help reduce boilerplate code and improve readability. For instance, if you frequently use a combination of annotations, you can create a custom composed annotation.

Example:

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Transactional
@Service
public @interface TransactionalService {
}

Usage:

@TransactionalService
public class MyTransactionalService {
    // Service methods
}

Meta-Annotations and Composed Annotations

Meta-annotations are annotations that can be applied to other annotations. They are useful for creating composed annotations that combine multiple annotations into one.

Example:

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@PreAuthorize("hasRole('USER')")
@PostAuthorize("returnObject.user == principal.username")
public @interface UserAccess {
}

Advanced Usage

Conditional Annotations

Spring Boot provides conditional annotations like @ConditionalOnProperty and @ConditionalOnMissingBean that allow beans to be created based on specific conditions.

Example:

@Configuration
public class ConditionalConfig {

    @Bean
    @ConditionalOnProperty(name = "feature.enabled", havingValue = "true")
    public MyFeatureService myFeatureService() {
        return new MyFeatureService();
    }
}

Aspect-Oriented Programming (AOP) with Annotations

AOP can be used to add cross-cutting concerns like logging and transaction management. Annotations like @Aspect and @Around help in defining AOP logic.

Example:

@Aspect
@Component
public class LoggingAspect {

    @Around("execution(* com.example.service.*.*(..))")
    public Object logAround(ProceedingJoinPoint joinPoint) throws Throwable {
        // Logging logic
        return joinPoint.proceed();
    }
}

Handling Custom Validation with @Validated and @Valid
  • Use @Validated on service classes: Placing it at the class level enables validation of method parameters annotated with @Valid.
  • Best Practice: Combine @Validated with @Valid and custom validator annotations to ensure data integrity.

Example:

@Service
@Validated
public class MyService {

    public void createUser(@Valid User user) {
        // Service logic
    }
}

Using @Transactional for Transaction Management
  • Use @Transactional for managing transactions: This annotation ensures that the annotated method runs within a transaction context.
  • Best Practice: Apply @Transactional at the service layer, not the repository layer, to maintain transaction boundaries.

Example:

@Service
public class MyService {

    @Transactional
    public void performTransactionalOperation() {
        // Transactional logic
    }
}

Annotation Pitfalls and Anti-Patterns

  • Overuse of annotations: Using too many annotations can make your code hard to read and maintain. Use annotations judiciously.
  • Misuse of @Autowired: Avoid using @Autowired for circular dependencies. Prefer constructor injection to avoid this issue.
  • Business logic in annotated methods: Keep business logic in service classes rather than in methods annotated with @Controller or @RestController.

Conclusion

Annotations are a powerful tool in Spring Boot, but they should be used wisely. By following best practices, you can make your code more readable, maintainable, and testable. Regularly review your use of annotations to ensure they are helping rather than hindering your development process. Implement these best practices to harness the full potential of annotations in your Spring Boot applications.

By focusing on these detailed best practices and providing concrete examples, this blog post offers practical and actionable advice to Spring Boot developers looking to improve their use of annotations.


RestClient in Spring 6 with Examples

RestClient in Spring 6.1

RestClient, introduced in Spring Framework 6.1, is a synchronous HTTP client with a modern, fluent API. The new client provides a convenient way to convert between Java objects and HTTP requests/responses, offering an abstraction over various underlying HTTP libraries. In this guide, we’ll explore how to create and use RestClient with simple, easy-to-understand examples.

Adding Dependencies

To get started with RestClient, you need to add the spring-boot-starter-web dependency to your pom.xml file:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>

Gradle

For a Gradle-based project, include the following dependency in your build.gradle file:

implementation 'org.springframework.boot:spring-boot-starter-web'

Configuring RestClient as a Spring Bean

To use RestClient effectively in your Spring application, it is recommended to define it as a Spring bean. This allows you to inject it into your services or controllers easily.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestClient;

@Configuration
public class RestClientConfig {

    @Bean
    public RestClient restClient() {
        return RestClient.builder().build();
    }
}

Using the RestClient

To make an HTTP request with RestClient, start by specifying the HTTP method. This can be done using method(HttpMethod) or convenience methods like get(), post(), etc.

1. GET Request Example

First, let’s see how to perform a simple GET request.

Example: GET Request

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

@Service
public class ApiService {

    @Autowired
    private RestClient restClient;

    public String fetchData() {
        String response = restClient.get()
            .uri("https://api.example.com/data")
            .retrieve()
            .body(String.class);

        System.out.println(response);
        return response;
    }
}

In this example, we create a RestClient bean and inject it into our ApiService. We then use it to make a GET request to fetch data from https://api.example.com/data.

2. POST Request Example

Next, let’s see how to perform a POST request with a request body.

Example: POST Request

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

import java.util.Map;

@Service
public class OrderService {

    @Autowired
    private RestClient restClient;

    public ResponseEntity<Void> createOrder(Map<String, String> order) {
        return restClient.post()
            .uri("https://api.example.com/orders")
            .contentType(MediaType.APPLICATION_JSON)
            .body(order)
            .retrieve()
            .toBodilessEntity();
    }
}

In this example, we send a POST request to create a new order. The order data is passed as a Map<String, String> and converted to JSON automatically.

3. PUT Request Example

Let’s see how to perform a PUT request with a request body.

Example: PUT Request

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

import java.util.Map;

@Service
public class UpdateService {

    @Autowired
    private RestClient restClient;

    public ResponseEntity<Void> updateResource(int resourceId, Map<String, Object> updatedData) {
        return restClient.put()
            .uri("https://api.example.com/resources/{id}", resourceId)
            .contentType(MediaType.APPLICATION_JSON)
            .body(updatedData)
            .retrieve()
            .toBodilessEntity();
    }
}

In this example, we send a PUT request to update a resource identified by resourceId. The updated data is passed as a Map<String, Object> and converted to JSON automatically.

4. DELETE Request Example

Now, let’s see how to perform a DELETE request.

Example: DELETE Request

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

@Service
public class DeleteService {

    @Autowired
    private RestClient restClient;

    public ResponseEntity<Void> deleteResource(int resourceId) {
        return restClient.delete()
            .uri("https://api.example.com/resources/{id}", resourceId)
            .retrieve()
            .toBodilessEntity();
    }
}

In this example, we send a DELETE request to delete a resource identified by resourceId.

Handling Responses

You can access the HTTP response status code, headers, and body using ResponseEntity.

Example: Accessing ResponseEntity

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

@Service
public class UserService {

    @Autowired
    private RestClient restClient;

    public void getUserDetails() {
        ResponseEntity<String> responseEntity = restClient.get()
            .uri("https://api.example.com/users/1")
            .retrieve()
            .toEntity(String.class);

        System.out.println("Status code: " + responseEntity.getStatusCode());
        System.out.println("Headers: " + responseEntity.getHeaders());
        System.out.println("Body: " + responseEntity.getBody());
    }
}

RestClient in Spring 6: Error Handling

By default, RestClient throws a subclass of RestClientException for responses with 4xx or 5xx status codes. You can customize this behavior using onStatus.

Example: Custom Error Handling

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatusCode;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;
import org.springframework.web.client.RestClientException;

@Service
public class ErrorHandlingService {

    @Autowired
    private RestClient restClient;

    public String fetchDataWithErrorHandling() {
        try {
            return restClient.get()
                .uri("https://api.example.com/nonexistent")
                .retrieve()
                .onStatus(HttpStatusCode::is4xxClientError, (request, response) -> {
                    throw new CustomClientException("Client error: " + response.getStatusCode());
                })
                .body(String.class);
        } catch (RestClientException e) {
            e.printStackTrace();
            return "An error occurred";
        }
    }
}

Advanced Scenarios with Exchange

For advanced scenarios, RestClient provides access to the underlying HTTP request and response through the exchange() method. Status handlers are not applied when using exchange(), allowing for custom error handling.

Example: Advanced GET Request

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClient;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Map;

@Service
public class AdvancedService {

    @Autowired
    private RestClient restClient;

    public Map<String, Object> getUser(int id) {
        return restClient.get()
            .uri("https://api.example.com/users/{id}", id)
            .accept(MediaType.APPLICATION_JSON)
            .exchange((request, response) -> {
                if (response.getStatusCode().is4xxClientError()) {
                    throw new CustomClientException("Client error: " + response.getStatusCode());
                } else {
                    ObjectMapper mapper = new ObjectMapper();
                    return mapper.readValue(response.getBody(), new TypeReference<Map<String, Object>>() {});
                }
            });
    }
}

Choosing Between RestTemplate vs RestClient vs WebClient

1. RestTemplate

  • Use Case:
    • Traditional Synchronous Applications: Use RestTemplate if you are working in a traditional Spring MVC application where synchronous HTTP calls suffice.
    • Simple CRUD Operations: For straightforward HTTP interactions such as fetching data from RESTful services using blocking calls.
  • Key Features:
    • Template-based API (getForObject, postForObject, etc.).
    • Synchronous blocking calls.
    • Well-established, widely used in existing Spring applications.
  • Example Scenario:
    • Integrating with legacy systems or existing codebases using synchronous HTTP communication.

2. RestClient

  • Use Case:
    • Modern Synchronous Applications: Choose RestClient for applications requiring more flexibility and control over HTTP requests and responses.
    • Enhanced Error Handling: When you need to handle specific HTTP status codes or exceptions with onStatus.
  • Key Features:
    • Fluent API (get, post, put, delete) with method chaining.
    • Built-in support for content negotiation and message converters.
    • Configurable request and response handling.
  • Example Scenario:
    • Building new applications in Spring Framework 6 that benefit from a modern, flexible synchronous HTTP client.
    • Customizing HTTP headers, request bodies, and error handling mechanisms.

3. WebClient

  • Use Case:
    • Reactive and Non-blocking Applications: Opt for WebClient in reactive applications leveraging Spring WebFlux.
    • High-Concurrency: When handling high volumes of requests concurrently with asynchronous processing.
  • Key Features:
    • Non-blocking and reactive API.
    • Functional style with operators like flatMap, map, etc., for composing requests and handling responses.
    • Supports both synchronous (blocking) and asynchronous (reactive) modes.
  • Example Scenario:
    • Developing microservices architectures or event-driven systems where responsiveness and scalability are critical.
    • Implementing real-time data streaming or processing pipelines using reactive programming principles.

Conclusion

RestClient in Spring Framework 6.1 offers a modern, fluent API for interacting with RESTful services. Its flexibility and ease of use make it a powerful tool for any Spring developer. Whether making simple GET requests or handling complex scenarios, RestClient provides the capabilities you need for efficient and effective HTTP communication.

By following this guide, you should now be well-equipped to use RestClient in your Spring applications, making your development process smoother and more efficient.


Discovering Java’s Hidden Features for Better Code

Introduction

Java is a powerful language with numerous features that can enhance your coding experience. This post, titled “Discovering Java’s Hidden Features for Better Code,” uncovers lesser-known Java features to help you write better and more efficient code.

1. Optional.ofNullable for Safer Null Handling

Avoid NullPointerExceptions using Optional.ofNullable.

Example:

import java.util.Optional;

public class OptionalExample {
    public static void main(String[] args) {
        String value = null;
        Optional<String> optionalValue = Optional.ofNullable(value);

        optionalValue.ifPresentOrElse(
            v -> System.out.println("Value is: " + v),
            () -> System.out.println("Value is absent")
        );
    }
}

Output:

Value is absent

In this example, Optional.ofNullable checks if value is null and allows us to handle it without explicit null checks.

2. Using Streams for Simplified Data Manipulation

Java Streams API offers a concise way to perform operations on collections.

Advanced Stream Operations:

import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamExample {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Alice", "Bob", "Charlie", "David", "Edward");

        // Filter and Collect
        List<String> filteredNames = names.stream()
                                          .filter(name -> name.length() > 3)
                                          .collect(Collectors.toList());
        System.out.println("Filtered Names: " + filteredNames);

        // Grouping by length
        Map<Integer, List<String>> groupedByLength = names.stream()
                                                          .collect(Collectors.groupingBy(String::length));
        System.out.println("Grouped by Length: " + groupedByLength);
    }
}

Output:

Filtered Names: [Alice, Charlie, David, Edward]
Grouped by Length: {3=[Bob], 5=[Alice, David], 6=[Edward], 7=[Charlie]}

This demonstrates filtering a list and grouping by string length using streams, simplifying complex data manipulations.

3. Pattern Matching for Instanceof: Simplifying Type Checks

Introduced in Java 16, pattern matching for instanceof simplifies type checks and casts.

Real-World Example:

public class InstanceofExample {
    public static void main(String[] args) {
        Object obj = "Hello, World!";
        
        if (obj instanceof String s) {
            System.out.println("The string length is: " + s.length());
        } else {
            System.out.println("Not a string");
        }
    }
}

Output:

The string length is: 13

Pattern matching reduces boilerplate code and enhances readability by combining type check and cast in one step.

4. Compact Number Formatting for Readable Outputs

Java 12 introduced compact number formatting, ideal for displaying numbers in a human-readable format.

Example Usage:

import java.text.NumberFormat;
import java.util.Locale;

public class CompactNumberFormatExample {
    public static void main(String[] args) {
        NumberFormat compactFormatter = NumberFormat.getCompactNumberInstance(Locale.US, NumberFormat.Style.SHORT);
        compactFormatter.setMaximumFractionDigits(1); // the default of 0 would print "1M"
        String result = compactFormatter.format(1234567);
        System.out.println("Compact format: " + result);
    }
}

Output:

Compact format: 1.2M

This feature is useful for presenting large numbers in a concise and understandable manner, suitable for dashboards and reports.

5. Text Blocks for Clearer Multi-line Strings

Java 13 introduced text blocks, simplifying the handling of multi-line strings like HTML, SQL, and JSON.

Example Usage:

public class TextBlockExample {
    public static void main(String[] args) {
        String html = """
                      <html>
                          <body>
                              <h1>Hello, World!</h1>
                          </body>
                      </html>
                      """;
        System.out.println(html);
    }
}

Output:

<html>
    <body>
        <h1>Hello, World!</h1>
    </body>
</html>

Text blocks improve code readability by preserving the formatting of multi-line strings, making them easier to maintain and understand.

6. Unlocking Java’s Concurrent Utilities for Efficient Multithreading

The java.util.concurrent package offers robust utilities for concurrent programming, enhancing efficiency and thread safety.

Example Usage:

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLinkedQueueExample {
    public static void main(String[] args) {
        Queue<String> queue = new ConcurrentLinkedQueue<>();

        // Adding elements
        queue.add("Element1");
        queue.add("Element2");

        // Polling elements
        System.out.println("Polled: " + queue.poll());
        System.out.println("Polled: " + queue.poll());
    }
}

Output:

Polled: Element1
Polled: Element2

ConcurrentLinkedQueue is a thread-safe collection, ideal for concurrent applications where multiple threads access a shared collection.

7. Performance Tuning with Java Flight Recorder (JFR)

Java Flight Recorder (JFR) is a built-in feature of Oracle JDK and OpenJDK that provides profiling and diagnostic tools for optimizing Java applications.

Example Usage:

public class JFRDemo {
    public static void main(String[] args) throws InterruptedException {
        // Enable Java Flight Recorder (JFR)
        enableJFR();

        // Simulate application workload
        for (int i = 0; i < 1000000; i++) {
            String result = processRequest("Request " + i);
            System.out.println("Processed: " + result);
        }

        // Disable Java Flight Recorder (JFR)
        disableJFR();
    }

    private static String processRequest(String request) {
        // Simulate processing time
        try {
            Thread.sleep(10);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return "Processed " + request;
    }

    private static void enableJFR() {
        // Code to enable JFR
        // Example: -XX:+UnlockCommercialFeatures -XX:+FlightRecorder
    }

    private static void disableJFR() {
        // Code to disable JFR
        // Example: -XX:-FlightRecorder
    }
}

Explanation:

  • Enabling JFR: Configure JVM arguments to enable JFR. On Oracle JDK 8 this required -XX:+UnlockCommercialFeatures -XX:+FlightRecorder; since JDK 11, JFR ships with OpenJDK and a recording can be started with -XX:StartFlightRecording. This allows JFR to monitor application performance metrics.
  • Simulating Workload: The processRequest method simulates a workload, such as handling requests. JFR captures data on CPU usage, memory allocation, and method profiling during this simulation.
  • Disabling JFR: After monitoring, disable JFR using -XX:-FlightRecorder to avoid overhead in production environments.

Java Flight Recorder captures detailed runtime information, including method profiling and garbage collection statistics, aiding in performance tuning and troubleshooting.
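
On JDK 11 and later, a recording can also be controlled programmatically through the jdk.jfr API; here is a minimal sketch (the output file name is arbitrary):

import java.nio.file.Path;

import jdk.jfr.Recording;

public class JFRProgrammaticExample {
    public static void main(String[] args) throws Exception {
        try (Recording recording = new Recording()) {
            recording.start();                         // begin capturing JFR events

            // ... run the workload to be profiled ...

            recording.stop();
            recording.dump(Path.of("recording.jfr"));  // write the events to a file for analysis
        }
    }
}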


8. Leveraging Method Handles for Efficient Reflection-Like Operations

Method handles provide a flexible and performant alternative to Java’s reflection API for method invocation and field access.

Before: How We Used to Code with Reflection

Before method handles were introduced, Java developers typically used reflection for dynamic method invocation. Here’s a simplified example of using reflection:

import java.lang.reflect.Method;

public class ReflectionExample {
    public static void main(String[] args) throws Exception {
        String str = "Hello, World!";
        Method method = String.class.getMethod("substring", int.class, int.class);
        String result = (String) method.invoke(str, 7, 12);
        System.out.println(result); // Output: World
    }
}

Reflection involves obtaining Method objects, which can be slower due to runtime introspection and type checks.

With Method Handles: Enhanced Performance and Flexibility

Method handles offer a more direct and efficient way to perform dynamic method invocations:

import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class MethodHandlesExample {
    public static void main(String[] args) throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        MethodHandle mh = lookup.findVirtual(String.class, "substring", MethodType.methodType(String.class, int.class, int.class));

        String result = (String) mh.invokeExact("Hello, World!", 7, 12);
        System.out.println(result); // Output: World
    }
}

Output:

World

Method handles enable direct access to methods and fields, offering better performance compared to traditional reflection.

9. Enhanced Date and Time Handling with java.time

Java 8 introduced the java.time package, providing a modern API for date and time manipulation, addressing shortcomings of java.util.Date and java.util.Calendar.

Example Usage:

import java.time.LocalDate;
import java.time.LocalTime;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DateTimeExample {
    public static void main(String[] args) {
        LocalDate date = LocalDate.now();
        LocalTime time = LocalTime.now();
        LocalDateTime dateTime = LocalDateTime.now();

        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        String formattedDateTime = dateTime.format(formatter);

        System.out.println("Current Date: " + date);
        System.out.println("Current Time: " + time);
        System.out.println("Formatted Date-Time: " + formattedDateTime);
    }
}

Output:

Current Date: 2024-06-15
Current Time: 14:23:45.123
Formatted Date-Time: 2024-06-15 14:23:45

The java.time API simplifies date and time handling with immutable and thread-safe classes, supporting various date-time operations and formatting.

Conclusion

By leveraging these hidden gems in Java, you can streamline your code, enhance performance, and simplify complex tasks. These features not only improve productivity but also contribute to writing cleaner, more maintainable Java applications. Embrace these tools and techniques to stay ahead in your Java development journey!

Java 8 Functional Interfaces: Features and Benefits

Java 8 introduced functional interfaces, which are interfaces containing only one abstract method. The method itself is known as the functional method or Single Abstract Method (SAM). Examples include:

Predicate: Represents a predicate (boolean-valued function) of one argument. Contains only the test() method, which evaluates the predicate on the given argument.

Supplier: Represents a supplier of results. Contains only the get() method, which returns a result.

Consumer: Represents an operation that accepts a single input argument and returns no result. Contains only the accept() method, which performs the operation on the given argument.

Function: Represents a function that accepts one argument and produces a result. Contains only the apply() method, which applies the function to the given argument.

BiFunction: Represents a function that accepts two arguments and produces a result. Contains only the apply() method, which applies the function to the given arguments.

Runnable: Represents a task that can be executed. Contains only the run() method, which is where the task logic is defined.

Comparable: Represents objects that can be ordered. Contains only the compareTo() method, which compares this object with the specified object for order.

ActionListener: Represents an action event listener. Contains only the actionPerformed() method, which is invoked when an action occurs.

Callable: Represents a task that returns a result and may throw an exception. Contains only the call() method, which executes the task and returns the result.
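
To make the interfaces listed above concrete, here is a small illustrative sketch that implements a few of them with lambda expressions:

import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.function.Supplier;

public class BuiltInFunctionalInterfaces {
    public static void main(String[] args) {
        Predicate<Integer> isEven = n -> n % 2 == 0;            // test()
        Supplier<String> greeting = () -> "Hello";              // get()
        Function<String, Integer> length = s -> s.length();     // apply()
        Consumer<String> printer = s -> System.out.println(s);  // accept()

        System.out.println(isEven.test(10));       // true
        System.out.println(greeting.get());        // Hello
        System.out.println(length.apply("Java"));  // 4
        printer.accept("Functional interfaces");   // Functional interfaces
    }
}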


Benefits of @FunctionalInterface Annotation

The @FunctionalInterface annotation was introduced to explicitly mark an interface as a functional interface. It ensures that the interface has only one abstract method and allows additional default and static methods.

In a functional interface, besides the single abstract method (SAM), any number of default and static methods can also be defined. For instance:

interface ExampleInterface {
    void method1(); // Abstract method

    default void method2() {
        System.out.println("Hello"); // Default method
    }
}

Java 8 introduced the @FunctionalInterface annotation to explicitly mark an interface as a functional interface:

@FunctionalInterface
interface ExampleInterface {
    void method1();
}

It’s important to note that a functional interface can have only one abstract method. If there is more than one abstract method, a compilation error occurs.


Inheritance in Functional Interfaces

If an interface extends a functional interface and does not contain any abstract methods itself, it remains a functional interface. For example:

@FunctionalInterface
interface A {
    void methodOne();
}

@FunctionalInterface
interface B extends A {
    // Valid to extend and not add more abstract methods
}

However, if the child interface introduces any new abstract methods, it ceases to be a functional interface and using @FunctionalInterface will result in a compilation error.
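
For contrast, here is a sketch of a child interface that declares an additional abstract method, reusing interface A from the example above:

interface C extends A {
    void methodTwo(); // a second abstract method, so C is no longer a functional interface
}

// Annotating C with @FunctionalInterface would cause a compilation error.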

Lambda Expressions and Functional Interfaces:

Lambda expressions are used to invoke the functionality defined in functional interfaces. They provide a concise way to implement functional interfaces. For example:

Without Lambda Expression:

interface ExampleInterface {
    void methodOne();
}

class Demo implements ExampleInterface {
    public void methodOne() {
        System.out.println("Method one execution");
    }
}

class Test {
    public static void main(String[] args) {
        ExampleInterface obj = new Demo();
        obj.methodOne();
    }
}

With Lambda Expression:

interface ExampleInterface {
    void methodOne();
}

class Test {
    public static void main(String[] args) {
        ExampleInterface obj = () -> System.out.println("Method one execution");
        obj.methodOne();
    }
}

Advantages of Lambda Expressions:

  1. They reduce code length, improving readability.
  2. They simplify complex implementations of anonymous inner classes.
  3. They can be used wherever functional interfaces are applicable.

Anonymous Inner Classes vs Lambda Expressions:

Lambda expressions are often used to replace anonymous inner classes, reducing code length and complexity. For example:

With Anonymous Inner Class:

class Test {
    public static void main(String[] args) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                for (int i = 0; i < 10; i++) {
                    System.out.println("Child Thread");
                }
            }
        });
        t.start();
        for (int i = 0; i < 10; i++) {
            System.out.println("Main Thread");
        }
    }
}

With Lambda Expression:

class Test {
    public static void main(String[] args) {
        Thread t = new Thread(() -> {
            for (int i = 0; i < 10; i++) {
                System.out.println("Child Thread");
            }
        });
        t.start();
        for (int i = 0; i < 10; i++) {
            System.out.println("Main Thread");
        }
    }
}

Differences between Anonymous Inner Classes and Lambda Expressions

  • An anonymous inner class is a class without a name; a lambda expression is a method without a name (an anonymous function).
  • An anonymous inner class can extend concrete and abstract classes; a lambda expression cannot extend concrete or abstract classes.
  • An anonymous inner class can implement interfaces with any number of methods; a lambda expression can only implement interfaces with a single abstract method.
  • An anonymous inner class can declare instance variables; a lambda expression cannot declare instance variables, and the variables it uses are treated as final.
  • An anonymous inner class has a separate .class file generated at compilation; a lambda expression has no separate .class file and is converted into a private method.

In summary, lambda expressions offer a concise and effective way to implement functional interfaces, enhancing code readability and reducing complexity compared to traditional anonymous inner classes.

Related Articles:

Java 8 Lambda Expressions with Examples

Java 8 Lambda Expressions with Examples

Lambda expressions in Java 8 are essentially unnamed functions without return types or access modifiers. They’re also known as anonymous functions or closures. Let’s explore Java 8 lambda expressions with examples.

Example 1:

public void m() {
    System.out.println("Hello world");
}

Can be expressed as:


() -> {
    System.out.println("Hello world");   
}

//or

() ->  System.out.println("Hello world");

Example 2:

public void m1(int i, int j) {
    System.out.println(i + j);
}

Can be expressed as:

(int i, int j) -> {
    System.out.println(i + j);
}

If the type of the parameters can be inferred by the compiler based on the context, we can omit the types. The above lambda expression can be rewritten as:

(i, j) ->  System.out.println(i+j);

Example 3:

Consider the following transformation:

public String str(String s) {
    return s;
}

Can be expressed as:

(String s) -> { return s; }

or

(String s) -> s;

Conclusion:

  1. A lambda expression can have zero or more arguments (parameters).
  • Example:
() -> System.out.println("Hello world");
(int i) -> System.out.println(i);
(int i, int j) -> System.out.println(i + j);

2. We can specify the type of the parameter. If the compiler can infer the type based on the context, then we can omit the type.

Example:

(int a, int b) -> System.out.println(a + b);
(a, b) -> System.out.println(a + b);

3. If multiple parameters are present, they should be separated by a comma (,).

4. If no parameters are present, we use empty parentheses, like ().

Example:

() -> System.out.println("hello");

5. If only one parameter is present and if the compiler can infer the type, then we can omit the type and parentheses.

  • Example:
i -> System.out.println(i);

6. Similar to a method body, a lambda expression body can contain multiple statements. If there are multiple statements, they should be enclosed in curly braces {}. If there is only one statement, curly braces are optional.

7. Once we write a lambda expression, we can call that expression just like a method. To do this, functional interfaces are required.

This covers the basics of lambda expressions in Java 8 with relevant examples.

For more information, follow this link: Oracle’s guide on lambda expressions.


Related Articles:

Java Interview Questions and Answers

Java Interview Questions and Answers

Prepare for your Java job interview with confidence! Explore a comprehensive collection of Java interview questions and answers covering essential topics such as object-oriented programming, data structures, concurrency, exception handling, and more.

Detailed Java Interview Questions and Answers

  1. What are the main features of Java?
    • Answer: Java features include simplicity, object-oriented nature, portability, robustness, security, multithreading capability, and high performance through Just-In-Time compilation.
  2. Explain the concept of OOP and its principles in Java.
    • Answer: OOP principles in Java include:
      • Encapsulation: Bundling data and methods that operate on the data within a single unit (class).
public class Person {
    private String name;  // Encapsulated field
    
    public String getName() {  // Public method to access the field
        return name;
    }
    
    public void setName(String name) {
        this.name = name;
    }
}

Abstraction: Hiding complex implementation details and showing only necessary features.

abstract class Animal {
    abstract void makeSound();  // Abstract method
}

class Dog extends Animal {
    void makeSound() {
        System.out.println("Bark");
    }
}

Inheritance: A new class inherits properties and behavior from an existing class.

class Animal {
    void eat() {
        System.out.println("This animal eats food");
    }
}

class Dog extends Animal {
    void bark() {
        System.out.println("Bark");
    }
}

Polymorphism: Methods do different things based on the object it is acting upon.

Animal myDog = new Dog();
myDog.makeSound();  // Outputs: Bark

3. What is the difference between JDK, JRE, and JVM?

  • JDK (Java Development Kit): Contains tools for developing Java applications (JRE, compiler, debugger).
  • JRE (Java Runtime Environment): Runs Java applications, includes JVM and standard libraries.
  • JVM (Java Virtual Machine): Executes Java bytecode and provides a runtime environment.

4. Describe the memory management in Java.

Java uses automatic memory management with garbage collection. Memory is divided into heap (for objects) and stack (for method calls and local variables).

5. What is the Java Memory Model?

The Java Memory Model defines how threads interact through memory, ensuring visibility, ordering, and atomicity of shared variables.

6. How does garbage collection work in Java?

Garbage collection automatically frees memory by removing objects that are no longer referenced. Algorithms include mark-and-sweep and generational collection.

7. What are the different types of references in Java?

  • Strong: Default type, prevents garbage collection.
  • Soft: Used for caches, collected before OutOfMemoryError.
  • Weak: Used for canonicalizing mappings, collected eagerly.
  • Phantom: Used for cleanup actions, collected after finalization.
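
An illustrative sketch of strong versus weak references; whether the weak reference is cleared after the gc() hint depends on the JVM:

import java.lang.ref.WeakReference;

public class ReferenceExample {
    public static void main(String[] args) {
        Object strong = new Object();                                    // strongly reachable, never collected while referenced
        WeakReference<Object> weak = new WeakReference<>(new Object());  // referent is only weakly reachable

        System.gc(); // only a hint; collection is not guaranteed

        System.out.println(strong);      // still a live object
        System.out.println(weak.get());  // likely null once the referent has been collected
    }
}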

8. Explain the finalize() method.

The finalize() method is called by the garbage collector before an object is collected. It’s used to clean up resources but is deprecated due to unpredictability.

9. What is the difference between == and equals() in Java?

  • == compares reference identity.
  • equals() compares object content.
String a = new String("hello");
String b = new String("hello");
System.out.println(a == b);  // false
System.out.println(a.equals(b));  // true

10. What is the hashCode() method? How is it related to equals()?

The hashCode() method returns an integer hash code for the object. If two objects are equal (equals() returns true), they must have the same hash code to ensure correct functioning in hash-based collections.

public class Person {
    private String name;

    @Override
    public boolean equals(Object obj) {
        if (this == obj) return true;
        if (obj == null || getClass() != obj.getClass()) return false;
        Person person = (Person) obj;
        return name.equals(person.name);
    }

    @Override
    public int hashCode() {
        return name.hashCode();
    }
}

11. Explain the use of the volatile keyword.

The volatile keyword ensures that the value of a variable is always read from main memory, not from a thread’s local cache. It guarantees visibility of changes to variables across threads.

private volatile boolean flag = true;

12. What are the differences between wait() and sleep()?

  • wait(): Causes the current thread to release the monitor lock and wait until another thread invokes notify() or notifyAll() on the same object.
  • sleep(): Causes the current thread to pause execution for a specified time without releasing the monitor lock.
synchronized (obj) {
    obj.wait();  // releases the lock on obj
}

Thread.sleep(1000);  // pauses the current thread for 1 second

13. What is the difference between notify() and notifyAll()?

  • notify(): Wakes up a single thread that is waiting on the object’s monitor.
  • notifyAll(): Wakes up all threads that are waiting on the object’s monitor.
synchronized (obj) {
    obj.notify();  // wakes up one waiting thread
}

synchronized (obj) {
    obj.notifyAll();  // wakes up all waiting threads
}

14. What is a deadlock? How can it be avoided?

A deadlock occurs when two or more threads are blocked forever, each waiting for the other to release a resource. It can be avoided by acquiring locks in a consistent order and using timeout for lock acquisition.

// Avoiding deadlock by acquiring locks in the same order
synchronized (lock1) {
    synchronized (lock2) {
        // critical section
    }
}

15. What are the different types of thread pools in Java?

  • FixedThreadPool: A fixed number of threads.
  • CachedThreadPool: Creates new threads as needed and reuses existing ones.
  • SingleThreadExecutor: A single worker thread.
  • ScheduledThreadPool: A pool that can schedule commands to run after a delay or periodically.
ExecutorService fixedPool = Executors.newFixedThreadPool(10);
ExecutorService cachedPool = Executors.newCachedThreadPool();
ExecutorService singleThreadExecutor = Executors.newSingleThreadExecutor();
ScheduledExecutorService scheduledPool = Executors.newScheduledThreadPool(5);

16. Explain the use of the Callable and Future interfaces.

Callable is similar to Runnable but can return a result and throw a checked exception. Future represents the result of an asynchronous computation, allowing us to retrieve the result once the computation is complete.

Callable<Integer> task = () -> {
    return 123;
};

ExecutorService executor = Executors.newFixedThreadPool(1);
Future<Integer> future = executor.submit(task);

Integer result = future.get();  // returns 123

Collections Framework

17. What is the Java Collections Framework?

The Java Collections Framework provides a set of interfaces (List, Set, Map) and implementations (ArrayList, HashSet, HashMap) for managing groups of objects.

18. Explain the difference between ArrayList and LinkedList.

  • ArrayList: Uses a dynamic array, fast random access, slow insertions/deletions.
  • LinkedList: Uses a doubly-linked list, slower access, fast insertions/deletions.
List<String> arrayList = new ArrayList<>();
List<String> linkedList = new LinkedList<>();

19. How does HashMap work internally?

HashMap uses an array of buckets, each bucket containing a linked list or a tree. The key’s hash code determines the bucket index. Collisions are resolved by chaining (linked list) or tree (if many elements).

Map<String, Integer> map = new HashMap<>();
map.put("key", 1);

20. What is the difference between HashSet and TreeSet?

  • HashSet: Uses HashMap, no order, constant-time performance.
  • TreeSet: Uses TreeMap, maintains sorted order, log-time performance.
Set<String> hashSet = new HashSet<>();
Set<String> treeSet = new TreeSet<>();

21. What is the difference between Comparable and Comparator?

  • Comparable: Defines natural ordering within the class by implementing compareTo().
  • Comparator: Defines custom ordering outside the class by implementing compare().
class Person implements Comparable<Person> {
    private String name;

    @Override
    public int compareTo(Person other) {
        return this.name.compareTo(other.name);
    }
}

class PersonNameComparator implements Comparator<Person> {
    @Override
    public int compare(Person p1, Person p2) {
        return p1.name.compareTo(p2.name);
    }
}

22. What is the use of the Collections utility class?

The Collections class provides static methods for manipulating collections, such as sorting, searching, and shuffling.

List<String> list = new ArrayList<>(Arrays.asList("b", "c", "a"));
Collections.sort(list);  // sorts the list

23. Explain the Iterator interface.

The Iterator interface provides methods to iterate over a collection (hasNext(), next(), remove()).

List<String> list = new ArrayList<>(Arrays.asList("a", "b", "c"));
Iterator<String> iterator = list.iterator();
while (iterator.hasNext()) {
    System.out.println(iterator.next());
}

24. What is the difference between Iterator and ListIterator?

  • Iterator allows traversing elements in one direction.
  • ListIterator extends Iterator and allows bi-directional traversal and modification of elements.
List<String> list = new ArrayList<>();
ListIterator<String> listIterator = list.listIterator();

25. What is the LinkedHashMap class?

LinkedHashMap maintains a doubly-linked list of its entries, preserving insertion order or access order. It extends HashMap.

LinkedHashMap<String, Integer> linkedHashMap = new LinkedHashMap<>();
linkedHashMap.put("one", 1);

26. What is the PriorityQueue class?

PriorityQueue is a queue that orders its elements according to their natural ordering or by a specified comparator. The head of the queue is the least element.

PriorityQueue<Integer> priorityQueue = new PriorityQueue<>();
priorityQueue.add(3);
priorityQueue.add(1);
priorityQueue.add(2);
System.out.println(priorityQueue.poll());  // Outputs: 1

27. How does the ConcurrentHashMap class work?

ConcurrentHashMap allows concurrent read and write operations. In Java 7 it divided the map into segments and locked only the affected segment during updates; since Java 8 it uses CAS operations and fine-grained synchronization on individual bins instead.

ConcurrentHashMap<String, Integer> concurrentMap = new ConcurrentHashMap<>();
concurrentMap.put("key", 1);

28. What is the TreeMap class?

TreeMap is a NavigableMap implementation that uses a Red-Black tree. It orders its elements based on their natural ordering or by a specified comparator.

TreeMap<String, Integer> treeMap = new TreeMap<>();
treeMap.put("b", 2);
treeMap.put("a", 1);

29. What is the difference between HashMap and TreeMap?

HashMap provides constant-time performance for basic operations but does not maintain any order. TreeMap provides log-time performance and maintains its elements in sorted order.

HashMap<String, Integer> hashMap = new HashMap<>();
TreeMap<String, Integer> treeMap = new TreeMap<>();

30. How does the WeakHashMap class work?

WeakHashMap uses weak references for its keys, allowing them to be garbage-collected if there are no strong references. It is useful for implementing canonicalizing mappings.

WeakHashMap<String, Integer> weakHashMap = new WeakHashMap<>();

31. Explain the CopyOnWriteArrayList class.

CopyOnWriteArrayList is a thread-safe variant of ArrayList where all mutative operations (add, set, etc.) are implemented by making a fresh copy of the underlying array.

CopyOnWriteArrayList<String> cowList = new CopyOnWriteArrayList<>();

32. What is the Deque interface?

Deque (Double Ended Queue) is an interface that extends Queue and allows elements to be added or removed from both ends.

Deque<String> deque = new ArrayDeque<>();
deque.addFirst("first");
deque.addLast("last");

33. Explain the BlockingQueue interface.

BlockingQueue is a queue that supports operations that wait for the queue to become non-empty when retrieving and waiting for space to become available when storing. It’s useful in producer-consumer scenarios.

BlockingQueue<String> blockingQueue = new ArrayBlockingQueue<>(10);

34. What is the difference between Iterator and ListIterator?

  • Iterator allows traversing elements in one direction.
  • ListIterator extends Iterator and allows bi-directional traversal and modification of elements.
List<String> list = new ArrayList<>();
ListIterator<String> listIterator = list.listIterator();

Concurrency and Multithreading

35. What is a Thread in Java?

A Thread is a lightweight process that can execute code concurrently with other threads within the same application.

Thread thread = new Thread(() -> System.out.println("Hello from a thread"));
thread.start();

36. What is the Runnable interface?

Runnable represents a task that can be executed by a thread. It has a single method run().

Runnable task = () -> System.out.println("Task is running");
Thread thread = new Thread(task);
thread.start();

37. What is the Callable interface?

Callable is similar to Runnable but can return a result and throw a checked exception.

Callable<Integer> task = () -> 123;

38. Explain synchronized methods and blocks.

Synchronization ensures that only one thread can execute a block of code at a time, preventing data inconsistency.

public synchronized void synchronizedMethod() {
    // synchronized code
}

public void method() {
    synchronized(this) {
        // synchronized block
    }
}

39. What are thread states in Java?

A thread can be in one of several states: NEW, RUNNABLE, BLOCKED, WAITING, TIMED_WAITING, and TERMINATED.
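
A small sketch that observes these states through Thread.getState():

Thread thread = new Thread(() -> {
    try {
        Thread.sleep(100);                   // keeps the thread in TIMED_WAITING briefly
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
});

System.out.println(thread.getState());      // NEW
thread.start();
Thread.sleep(50);
System.out.println(thread.getState());      // typically TIMED_WAITING while sleeping
thread.join();
System.out.println(thread.getState());      // TERMINATED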

40. What is the ExecutorService?

ExecutorService is a high-level replacement for working with threads directly. It manages a pool of worker threads, allowing you to submit tasks for execution.

ExecutorService executor = Executors.newFixedThreadPool(10);
executor.submit(() -> System.out.println("Task executed"));
executor.shutdown();

41. What is the difference between submit() and execute() methods in ExecutorService?

  • execute(): Executes a Runnable task but does not return a result.
  • submit(): Submits a Runnable or Callable task and returns a Future representing the task’s result.
ExecutorService executor = Executors.newFixedThreadPool(1);
executor.execute(() -> System.out.println("Runnable executed"));
Future<Integer> future = executor.submit(() -> 123);

42. What is a CountDownLatch?

CountDownLatch is a synchronization aid that allows one or more threads to wait until a set of operations in other threads completes.

CountDownLatch latch = new CountDownLatch(3);

Runnable task = () -> {
    System.out.println("Task completed");
    latch.countDown();
};

new Thread(task).start();
latch.await();  // Main thread waits until the count reaches zero

43. What is a CyclicBarrier?

CyclicBarrier is a synchronization aid that allows a set of threads to all wait for each other to reach a common barrier point.

CyclicBarrier barrier = new CyclicBarrier(3, () -> System.out.println("All tasks completed"));

Runnable task = () -> {
    System.out.println("Task executed");
    try {
        barrier.await();  // Wait until all 3 threads reach the barrier
    } catch (InterruptedException | BrokenBarrierException e) {
        Thread.currentThread().interrupt();
    }
};

for (int i = 0; i < 3; i++) {
    new Thread(task).start();
}

44. Explain ReentrantLock and its usage.

ReentrantLock is a mutual exclusion lock with the same basic behavior as the implicit monitors accessed using synchronized blocks but with extended capabilities. It allows for more flexible locking operations and is useful in advanced concurrency scenarios.

ReentrantLock lock = new ReentrantLock();
lock.lock();  // Acquires the lock
try {
    // Critical section
} finally {
    lock.unlock();  // Releases the lock
}

45. What is a Semaphore?

Semaphore is a synchronization primitive that restricts the number of threads that can access a resource concurrently. It maintains a set of permits to control access.
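
A minimal sketch of limiting concurrent access with a Semaphore (the permit count and simulated work are illustrative):

Semaphore semaphore = new Semaphore(2);  // At most 2 threads may access the resource at once

Runnable task = () -> {
    try {
        semaphore.acquire();  // Blocks if no permit is available
        System.out.println("Accessing the resource: " + Thread.currentThread().getName());
        Thread.sleep(500);    // Simulate some work
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    } finally {
        semaphore.release();  // Return the permit
    }
};

for (int i = 0; i < 5; i++) {
    new Thread(task).start();
}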

46. What is a BlockingQueue?

BlockingQueue is a queue that supports operations that wait for the queue to become non-empty when retrieving an element and for space to become available when storing one. It’s useful in producer-consumer scenarios.

BlockingQueue<String> blockingQueue = new ArrayBlockingQueue<>(10);

47. Explain the ThreadLocal class.

ThreadLocal provides thread-local variables, allowing each thread to have its own independently initialized instance of the variable. It’s typically used to store per-thread context or avoid synchronization.

private static final ThreadLocal<Long> threadId = ThreadLocal.withInitial(() -> Thread.currentThread().getId());

public static long getThreadId() {
    return threadId.get();
}

48. What is the difference between start() and run() methods of the Thread class?

  • start(): Creates a new thread and starts its execution. It calls the run() method internally.
  • run(): Entry point for the thread’s execution. It should be overridden to define the task to be performed by the thread.
Thread thread = new Thread(() -> System.out.println("Hello from a thread"));
thread.start();  // Calls run() internally

49. What is a Future in Java concurrency?

Future represents the result of an asynchronous computation. It provides methods to check if the computation is complete, retrieve the result, or cancel the task.

ExecutorService executor = Executors.newFixedThreadPool(1);
Future<Integer> future = executor.submit(() -> 123);
Integer result = future.get();  // Waits for the computation to complete and retrieves the result

50. What is the CompletableFuture class?

CompletableFuture is a Future that may be explicitly completed (setting its value and status), enabling further control over the asynchronous computation.

CompletableFuture<String> future = CompletableFuture.supplyAsync(() -> "Hello");
future.thenApply(s -> s + " World").thenAccept(System.out::println);

Record Classes in Java 17

In Java, we often create functional classes, such as service or utility classes, that perform specific tasks. We also create classes whose only purpose is to store or carry data, and this is exactly the use case that record classes in Java 17 address.

For example:

public class Sample {
   private final int id = 10;
   private final String name = "Pavan";
}

When to use Record Classes in Java

When our object is immutable and we don’t intend to change its data, we create such objects primarily for data storage. Let’s explore how to create such a class in Java.

class Student {
    private final int id;
    private final String name;
    private final String college;

    public Student(int id, String name, String college) {
        this.id = id;
        this.name = name;
        this.college = college;
    }

    public int getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public String getCollege() {
        return college;
    }

    @Override
    public String toString() {
        return "Student{" +
                "id=" + id +
                ", name='" + name + '\'' +
                ", college='" + college + '\'' +
                '}';
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Student student = (Student) o;
        return id == student.id && Objects.equals(name, student.name)
                          && Objects.equals(college, student.college);
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, name, college);
    }
}

public class RecordTest {

    public static void main(String[] args) {
        Student s1 = new Student(1, "Pavan", "IIIT");
        Student s2 = new Student(2, "Sachin", "Jntu");
        Student s3 = new Student(2, "Sachin", "Jntu");

        System.out.println(s1.getName());
        System.out.println(s1);
        System.out.println(s1.equals(s2));
        System.out.println(s2.equals(s3)); //true
    }
}

Output:

Record Classes in Java 17

In this code, we’ve created a Student class to represent student data, ensuring immutability by making the fields id, name, and college final. Additionally, we’ve overridden the toString(), equals(), and hashCode() methods for better readability and correct comparison of objects. Finally, we’ve tested the class in the RecordTest class by creating instances of Student and performing some operations, such as printing details and checking for equality.

In Java 17, with the introduction of the records feature, the Student class can be replaced with a record class. It would look like this:

record Student (int id, String name, String college){}

public class RecordTest {

    public static void main(String[] args) {
        Student s1 = new Student(1, "Pavan", "IIIT");
        Student s2 = new Student(2, "Sachin", "Jntu");
        Student s3 = new Student(2, "Sachin", "Jntu");

        // Records don't generate getX()-style getters;
        // the name component is accessed as shown below.
        System.out.println(s1.name());
        System.out.println(s1);
        System.out.println(s1.equals(s2));
        System.out.println(s2.equals(s3)); //true

    }
}

Output:

Record Classes Java 17
  1. Parameterized Constructors: Record classes internally define parameterized constructors. All variables within a record class are private and final by default, reflecting the immutable nature of records.
  2. Equals() Method Implementation: By default, a record class implements the equals() method, ensuring proper equality comparison between instances.
  3. Automatic toString() Method: The toString() method is automatically defined for record instances, facilitating better string representation.
  4. No Default Constructor: It’s important to note that record classes do not have a default constructor. Attempting to instantiate a record class without parameters, like Student s = new Student();, would result in an error.
  5. Inheritance and Interfaces: Record classes cannot extend any other class because they implicitly extend the Record class. However, they can implement interfaces.
  6. Additional Methods: Methods can be added to record classes. Unlike traditional classes, record classes do not require getter and setter methods for accessing their components. Instead, components are accessed using the syntax objectName.componentName(), for example s.name(). A short sketch combining these points follows.
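
As an illustrative sketch (the interface and validation rule are assumptions added for the example), the record below uses a compact constructor for validation, implements an interface, and defines an extra method:

interface Printable {
    void print();
}

record Student(int id, String name, String college) implements Printable {

    // Compact constructor: runs validation before the components are assigned
    Student {
        if (id <= 0) {
            throw new IllegalArgumentException("id must be positive");
        }
    }

    // Additional method defined on the record
    @Override
    public void print() {
        System.out.println(id + " - " + name + " - " + college);
    }
}

Usage is unchanged: new Student(1, "Pavan", "IIIT") still works, and components are read with s.name() rather than a getter.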


Spring WebFlux Flux Tutorial Examples

Discover the capabilities of Spring WebFlux Flux with this comprehensive tutorial. Gain insights into creating, manipulating, and transforming Flux streams effectively using practical examples. Develop proficiency in asynchronous, non-blocking programming principles and elevate your Spring application development expertise.

Flux: It’s a reactive stream in Spring WebFlux that can emit 0 or N items over time. It represents a sequence of data elements and is commonly used for handling streams of data that may contain multiple elements or continuous data flows.

Example:

Flux<User> fluxUsers = Flux.fromIterable(Arrays.asList(new User("John"), new User("Alice"), new User("Bob")));

Set up a Spring Boot project using Spring Initializr or any IDE of your choice, and make sure to include the following dependency:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>

Example: Create FluxService.java class

This class, FluxService, offers various methods illustrating different functionalities of Flux streams.

  • getFlux(): Generates a Flux stream with predefined strings, serving as static data or example data.
  • getFluxList(): Converts collections of User objects into reactive streams, enabling integration with reactive programming.
  • filterFlux(): Filters elements in the Flux stream based on specific conditions, enabling selective processing.
  • flatMapExample(): Demonstrates asynchronous processing of each element in the Flux stream using flatMap.
  • tranformFluxExample(): Illustrates Flux transformation using the transform operator, promoting modularity and maintainability.
  • defaultIfEmptyExample(String str): Handles empty Flux streams gracefully by providing a default value based on a provided condition.
  • getBlankFlux(): Returns an empty Flux stream, useful as a placeholder or starting point for further processing.
@Service
public class FluxService {

    public Flux<String> getFlux() {
        return Flux.just("First", "Second", "Third", "Fourth", "Fifth");
    }

    public Flux<User> getFluxList() {
        return Flux.fromIterable(Arrays.asList(new User("Pavan", "pavan@123"),
                        new User("Kiran", "kiran@123")))
                .cast(User.class);
    }
    
    public Flux<String> filterFlux() {
        return getFlux().filter(data -> data.equals("CCC"));
    }

    public Flux<String> flatMapExample() {
        return getFlux().flatMap(data -> Flux.just(data)).delayElements(Duration.ofMillis(3000));
    }

    /*
        while you can achieve similar results without transform by chaining operators directly on the Flux,
        transform provides a cleaner, more modular approach to defining and applying Flux transformations,
        promoting code reuse, readability, and maintainability.
     */

    public void tranformFluxExample() {
        Flux<Integer> originalFlux = Flux.range(1, 10);
        //without transform method
        Flux<Integer> integerFlux = originalFlux.map(i -> i * 2).filter(i -> i % 3 != 0).publishOn(Schedulers.parallel());
        integerFlux.subscribe(data -> System.out.println(data)); // 2 4 8 10 14 16 20

        //with transform method.
        Flux<Integer> transformedFlux = originalFlux.transform(flux -> {
            return flux.map(i -> i * 2).filter(i -> i % 3 != 0).subscribeOn(Schedulers.parallel());
        });
        transformedFlux.subscribe(data -> System.out.println(data));
    }

    public Flux<String> defaultIfEmptyExample(String str) {
        return getFlux().filter(data -> data.contains(str)).defaultIfEmpty("Doesn't contain: " + str);
    }

    public Flux<Object> getBlankFlux() {
        return Flux.empty();
    }

}

Test the functionality of the FluxService class by running the following test methods for Spring WebFlux Flux.

@SpringBootTest
public class FluxServiceTest {

    @Autowired
    private FluxService fluxService;

    @Test
    void testFlux() {
        fluxService.getFlux().subscribe(data -> {
            System.out.println(data);
        });
    }

    @Test
    void testGetFluxList() {
        fluxService.getFluxList().subscribe(data -> {
            System.out.println(data);
        });
    }

    @Test
    void testFilter() {
        fluxService.filterFlux().subscribe(data -> System.out.println("filtered text: " + data));
        // No output is expected here: none of the source elements equals "CCC"
    }

    @Test
    void testFlatMap() {
        fluxService.flatMapExample().subscribe(data -> {
            System.out.println("FlatMap: " + data);
        });
    }

    @Test
    void tranformFluxExample() {
        fluxService.tranformFluxExample();
    }

    @Test
    void ifExample() {
        Flux<String> flux = fluxService.defaultIfEmptyExample("Third");
        flux.subscribe(data->{
            System.out.println("data: "+data);
        });
        StepVerifier.create(flux).expectNext("Third").verifyComplete();
    }

    @Test
    void getBlankFlux() {
        Flux<Object> blankFlux = fluxService.getBlankFlux();
        StepVerifier.create(blankFlux).verifyComplete();
    }
}

Output

Spring WebFlux Flux

Conclusion

In short, this article gave a hands-on look at Spring WebFlux Flux, showing how it works with easy examples. By getting a grasp of Flux’s role in reactive programming and trying out its functions, developers can use it to make Spring apps that react quickly and handle big loads. We made sure our code was solid by testing it well, setting the stage for effective and sturdy software building.

What are Microservices?

Microservices are a contemporary method for developing software applications. They involve breaking down the application into smaller, independent, deployable, loosely connected, and collaborative services. This approach simplifies application comprehension and facilitates application delivery. It’s important to first understand monolithic architecture before transitioning to microservices.

Topics Covered in Microservices

1. What are Microservices?

2. Spring Cloud Config Server Without Git

3. Spring Cloud Config Client

4. Reload Application Properties in Spring Boot

5. Eureka Server using Spring Boot

6. Spring Boot Eureka Discovery Client

7. Top 20 Microservices Interview Questions and Answers

        Spring Webflux Mono Example

        Spring WebFlux Mono

        In Spring WebFlux, Mono is crucial for managing asynchronous data streams. Think of Mono like a reliable source that can provide either no data or just one piece of information. This makes it ideal for situations where you’re expecting either a single result or nothing at all. When we look at a practical “Spring Webflux Mono Example,” Mono’s significance becomes clearer. It shows how effectively it handles asynchronous data streams, which is essential for many real-world applications.

        Mono: It can emit 0 or 1 item. It’s like a CompletableFuture that completes with 0 or 1 result. It’s commonly used when you expect a single result or no result, for example when finding an entity by its ID or saving an entity.

        Mono<User> monoUser = Mono.just(new User());
        

        Create a Spring Boot project and include the following dependency.

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-webflux</artifactId>
        </dependency>
        

        Example 1: Using Mono with CoreSubscriber

        package com.javadzone.webflux;
        
        import org.junit.jupiter.api.Test;
        import org.reactivestreams.Subscription;
        import org.springframework.boot.test.context.SpringBootTest;
        import reactor.core.CoreSubscriber;
        import reactor.core.publisher.Mono;
        
        @SpringBootTest
        class BootWebfluxApplicationTests {
            @Test
            public void test() {
                // Creating a Mono publisher with test data
                Mono<String> monoPublisher = Mono.just("Testdata");
        
                // Subscribing to the Mono publisher
                monoPublisher.subscribe(new CoreSubscriber<String>() {
                    // Callback method invoked when subscription starts
                    @Override
                    public void onSubscribe(Subscription s) {
                        System.out.println("on subscribe....");
                        s.request(1);
                    }
        
                    // Callback method invoked when data is emitted
                    @Override
                    public void onNext(String data) {
                        System.out.println("data: " + data);
                    }
        
                    // Callback method invoked when an error occurs
                    @Override
                    public void onError(Throwable t) {
                        System.out.println("exception occured: " + t.getMessage());
                    }
        
                    // Callback method invoked when subscription is completed
                    @Override
                    public void onComplete() {
                        System.out.println("completed the implementation....");
                    }
                });
            }
        }
        

        This example demonstrates the usage of Mono with a CoreSubscriber, where we create a Mono publisher with test data and subscribe to it. We handle the different callback methods, onSubscribe, onNext, onError, and onComplete, to manage the data stream.

        Example 2: Using Mono with various operators 

        package com.javadzone.webflux;
        
        import org.junit.jupiter.api.Test;
        import org.springframework.boot.test.context.SpringBootTest;
        import reactor.core.publisher.Flux;
        import reactor.core.publisher.Mono;
        import reactor.util.function.Tuple2;
        import reactor.util.function.Tuple4;
        
        @SpringBootTest
        public class MonoTest {
        
            @Test
            void testMono(){
                Mono<String> firstMono = Mono.just("First Mono");
                Mono<String> secondMono = Mono.just("Second Mono");
                Mono<String> thirdMono = Mono.just("Third Mono");
                Mono<String> fourthMono = Mono.just("Fourth Mono");
        
                // Subscribing to Monos and printing the data
                firstMono.subscribe(data -> {
                    System.out.println("Subscribed to firstMono: "+data);
                });
        
                secondMono.subscribe(data -> {
                    System.out.println("Subscribed to secondMono: "+ data);
                });
                
        
                // Combining Monos using zipWith and zip operators
                System.out.println("----------- zipWith() ------------ ");
                Mono<Tuple2<String, String>> tuple2Mono = firstMono.zipWith(secondMono);
                tuple2Mono.subscribe(data -> {
                    System.out.println(data.getT1());
                    System.out.println(data.getT2());
                });
                
        
                System.out.println("----------- zip() ------------ ");
                Mono<Tuple4<String, String, String, String>> zip = Mono.zip(firstMono, secondMono, thirdMono, fourthMono);
                zip.subscribe(data ->{
                    System.out.println(data.getT1());
                    System.out.println(data.getT2());
                    System.out.println(data.getT3());
                    System.out.println(data.getT4());
                });
                
        
                // Transforming Mono data using map and flatMap
                System.out.println("----------- map() ------------ ");
                Mono<String> map = firstMono.map(String::toUpperCase);
                map.subscribe(System.out:: println);
                
                
        
                System.out.println("----------- flatmap() ------------ ");
                //flatmap(): Transform the item emitted by this Mono asynchronously, 
                 //returning the value emitted by another Mono (possibly changing the value type).
                Mono<String[]> flatMapMono = firstMono.flatMap(data -> Mono.just(data.split(" ")));
                flatMapMono.subscribe(data-> {
                    for(String d: data) {
                        System.out.println(d);
                    }
                    //or
                    //Arrays.stream(data).forEach(System.out::println);
                });
                
                
        
                // Converting Mono into Flux using flatMapMany
                System.out.println("---------- flatMapMany() ------------- ");
        
                //flatMapMany(): Transform the item emitted by this Mono into a Publisher, 
                //then forward its emissions into the returned Flux.
                Flux<String> stringFlux = firstMono.flatMapMany(data -> Flux.just(data.split(" ")));
                stringFlux.subscribe(System.out::println);
        
        
        
                // Concatenating Monos using concatWith
                System.out.println("----------- concatwith() ------------ ");
        
                Flux<String> concatMono = firstMono.concatWith(secondMono);
                concatMono.subscribe(System.out::println);
            }
        
        }
        
        

        Output:

        Spring Webflux Mono Example

        This example showcases the usage of various Mono operators such as zipWith, zip, map, flatMap, flatMapMany, and concatWith. We create Monos with test data, subscribe to them, combine them using different operators, transform their data, and concatenate them. 

        Example 3: Writing Mono examples in a Controller 

        package com.javadzone.webflux.controller;
        
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.RestController;
        import reactor.core.publisher.Mono;
        import reactor.core.scheduler.Schedulers;
        
        import java.time.Duration;
        
        @RestController
        public class WeatherController {
        
            @GetMapping("/getWeatherDataAsync")
            public Mono<String> getWeatherDataAsync() {
                System.out.println("Real-time Example with Mono:");
        
                Mono<String> weatherMono = fetchWeatherDataAsync(); // Fetch weather data asynchronously
                weatherMono.subscribe(weather -> System.out.println("Received weather data: " + weather));
        
                System.out.println("Continuing with other tasks...");
        
                // Sleep for 6 seconds to ensure weather data retrieval completes
                try {
                    Thread.sleep(6000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                return weatherMono;
            }
        
        
            @GetMapping("getWeatherDataSync")
            public void getWeatherDataSync() {
                System.out.println("Simple Example without Mono:");
                fetchWeatherDataSync(); // Fetch weather data synchronously
                System.out.println("Continuing with other tasks...");
        
                // Sleep for 6 seconds to ensure weather data retrieval completes
                try {
                    Thread.sleep(6000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        
            public static Mono<String> fetchWeatherDataAsync() {
                System.out.println("Fetching weather data...");
                return Mono.delay(Duration.ofSeconds(5))  // Simulate API call delay of 5 seconds
                        .map(delay -> "Weather data: Sunny and 30°C") // Simulated weather data
                        .subscribeOn(Schedulers.boundedElastic()); // Execute on separate thread
            }
        
            public static void fetchWeatherDataSync() {
                System.out.println("Fetching weather data...");
                // Simulate API call delay of 5 seconds
                try {
                    Thread.sleep(5000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                System.out.println("Weather data: Sunny and 30°C");
            }
        }
        
        

        Example 4: Real-time Use Case with Spring WebFlux Mono Example:

        Let’s consider a real-time example of fetching weather data from an external API using Mono, and contrast it with a simple example without using Mono. The WeatherController shown in Example 3 exposes both variants: getWeatherDataAsync() returns a Mono and stays responsive while the data is fetched, whereas getWeatherDataSync() blocks until the data is available.


        When we access the synchronous endpoint at http://localhost:8080/getWeatherDataSync, the output will be displayed immediately.

        Simple Example without Mono:
        Fetching weather data...
        Weather data: Sunny and 30°C
        Continuing with other tasks...
        

        When we access the asynchronous endpoint at http://localhost:8080/getWeatherDataAsync, we will receive the weather data after other tasks have been completed.

        Real-time Example with Mono:
        Fetching weather data...
        Continuing with other tasks...
        Received weather data: Weather data: Sunny and 30°C
        

        Reactive Programming with Spring Boot WebFlux

        Understanding Reactive Programming

        Reactive programming with Spring Boot WebFlux presents a contemporary approach to handling data and events in applications. It involves the seamless management of information streams, leveraging asynchronous and non-blocking processes. This methodology significantly enhances efficiency and scalability throughout various systems.

        In contrast to traditional methods, which often relied on synchronous operations, reactive programming eliminates bottlenecks and constraints. By facilitating a fluid flow of data, applications operate more smoothly, resulting in improved responsiveness and performance.

        Synchronous and Blocking

        When a client sends a request, it’s assigned to a specific thread (let’s call it Thread1). If processing that request takes, say, 20 minutes, it holds up Thread1. During this time, if another request comes in, it will have to wait until Thread1 finishes processing the first request before it can be served. This behavior, where one request blocks the processing of others until it’s complete, is what we refer to as synchronous and blocking.

        Features of Reactive programming

        Asynchronous and Non-blocking

        In an asynchronous and non-blocking scenario, when a client sends a request, let’s say it’s picked up by Thread1. If processing that request takes, for example, 20 minutes, and another request comes in, it doesn’t wait for Thread1 to finish processing the first request. Instead, it’s handled separately, without blocking or waiting. Once the first request is complete, its response is returned. This approach allows clients to continue sending requests without waiting for each one to complete, thereby avoiding blocking. This style of operation, where requests are managed independently and without blocking, is commonly referred to as “non-blocking” or “asynchronous.”

        spring boot webflux

        Features of Reactive Programming with Spring Boot WebFlux

        Asynchronous and Non-blocking: Reactive programming focuses on handling tasks concurrently without waiting for each to finish before moving on to the next, making applications more responsive.

        Functional Style Coding: Reactive programming promotes a coding style that emphasizes functions or transformations of data streams, making code more expressive and modular.

        Data Flow as Event-Driven: In reactive programming, data flow is driven by events, meaning that actions or processing are triggered by changes or updates in data.

        Backpressure of DataStream: Reactive streams incorporate mechanisms to manage the flow of data between publishers and subscribers, allowing subscribers to control the rate at which they receive data, as the sketch below illustrates.
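
        A minimal sketch of subscriber-controlled backpressure using Reactor's BaseSubscriber (the batch size of 2 is just an illustration):

        import org.reactivestreams.Subscription;
        import reactor.core.publisher.BaseSubscriber;
        import reactor.core.publisher.Flux;

        public class BackpressureDemo {
            public static void main(String[] args) {
                Flux.range(1, 10).subscribe(new BaseSubscriber<Integer>() {
                    @Override
                    protected void hookOnSubscribe(Subscription subscription) {
                        request(2);  // ask the publisher for the first 2 items only
                    }

                    @Override
                    protected void hookOnNext(Integer value) {
                        System.out.println("Received: " + value);
                        request(2);  // pull the next batch when this subscriber is ready
                    }
                });
            }
        }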

        Reactive Stream Specifications:

        Reactive streams follow specific rules, known as specifications, to ensure consistency across implementations and interoperability.

        1. Publisher

        The Publisher interface represents a data source in reactive streams, allowing subscribers to register and receive data.

        @FunctionalInterface 
        public static interface Publisher<T> { 
           public void subscribe(Subscriber<? super T> subscriber); 
        } 
        

        2. Subscriber

        The Subscriber interface acts as a receiver of data from publishers, providing methods to handle incoming data, errors, and completion signals.

        public static interface Subscriber<T> {
                public void onSubscribe(Subscription subscription);
                public void onNext(T item);
                public void onError(Throwable throwable);
                public void onComplete();
         }
        

        3. Subscription

        The Subscription interface enables subscribers to request data from publishers or cancel their subscription, offering methods for requesting specific items and canceling subscriptions.

        public static interface Subscription {
                public void request(long n);
                public void cancel();
        }
        

        4. Processor

        The Processor interface combines the functionality of publishers and subscribers, allowing for the transformation and processing of data streams.

        public static interface Processor<T,R> extends Subscriber<T>, Publisher<R> {
        }
        

        Pub Sub Event Flow:

        • Subscriber subscribes by calling the Publisher’s subscribe(Subscriber s) method.
        • After Subscription, the Publisher calls the Subscriber’s onSubscribe(Subscription s) method.
        • The Subscriber now possesses a Subscription object. Utilizing this object, it requests ‘n’ (number of data) from the Publisher.
        • Subsequently, the Publisher invokes the onNext(data) method ‘n’ times, providing the requested data.
        • Upon successful completion, the Publisher calls the onComplete() method, as demonstrated in the sketch below.
        Reactive Programming with Spring Boot WebFlux
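
        The same handshake can be observed end-to-end with the JDK's java.util.concurrent.Flow API. The following is a minimal, illustrative sketch using SubmissionPublisher as the Publisher:

        import java.util.concurrent.Flow;
        import java.util.concurrent.SubmissionPublisher;

        public class FlowDemo {
            public static void main(String[] args) throws InterruptedException {
                SubmissionPublisher<String> publisher = new SubmissionPublisher<>();

                publisher.subscribe(new Flow.Subscriber<String>() {
                    private Flow.Subscription subscription;

                    @Override
                    public void onSubscribe(Flow.Subscription subscription) {
                        this.subscription = subscription;
                        subscription.request(1);  // request the first item
                    }

                    @Override
                    public void onNext(String item) {
                        System.out.println("Received: " + item);
                        subscription.request(1);  // request the next item
                    }

                    @Override
                    public void onError(Throwable throwable) {
                        throwable.printStackTrace();
                    }

                    @Override
                    public void onComplete() {
                        System.out.println("completed");
                    }
                });

                publisher.submit("First");
                publisher.submit("Second");
                publisher.close();   // triggers onComplete after delivery
                Thread.sleep(500);   // give the asynchronous delivery time to finish
            }
        }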

        ChatGPT Integration with Spring Boot

        1. Overview

        This guide will walk you through the process of integrating ChatGPT with Spring Boot. In many companies, ChatGPT is restricted, making it challenging to use. However, this tutorial provides a solution to seamlessly integrate ChatGPT with Spring Boot, ensuring smooth implementation without encountering any restrictions. Let’s get started with ChatGPT Integration with Spring Boot!

        2. What is Spring Boot

        Spring Boot is a framework used to build web applications. It’s kind of particular about how things are set up, offering default configurations that you can tweak to fit your needs. If you want to dive deeper into what Spring Boot is all about and its features, you can check out this detailed guide: https://javadzone.com/exploring-what-is-spring-boot-features/

        3. Create OpenAI API Key

        Sign up and create your own OpenAI API key here

        Click on “Create new secret key” and optionally give it a name. Then click on “Create secret key”. It will generate a secret key; copy it and save it somewhere safe.

        4. ChatGPT Integration with Spring Boot: Code Example

        Create a Spring Boot project using your IDE or spring initializr, and add the following dependencies:

        If you are using Maven, add the following dependencies:

        XML
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>

        If you are using Gradle, add the following dependencies:

        Groovy
        implementation 'org.springframework.boot:spring-boot-starter-web'
        compileOnly 'org.projectlombok:lombok'
        annotationProcessor 'org.projectlombok:lombok'

        The project structure looks like this:

        ChatGPT Integration with Spring Boot.

        4.1 Create the CustomBotRequest POJO Class

        Java
        package com.chatgpt.bootchatgpt.beans;
        
        
        import lombok.AllArgsConstructor;
        import lombok.Data;
        
        import java.util.List;
        
        @Data
        @AllArgsConstructor
        public class CustomBotRequest {
            private String model;
            private List<Message> messages;
        }
        

        4.2 Create the Message POJO Class

        Java
        package com.chatgpt.bootchatgpt.beans;
        import lombok.AllArgsConstructor;
        import lombok.Data;
        
        @Data
        @AllArgsConstructor
        public class Message {
            private String role;
            private String content;
        }
        

        4.3 Create the CustomBotResponse POJO Class

        Java
        package com.chatgpt.bootchatgpt.beans;
        
        import lombok.Data;
        
        import java.util.List;
        
        @Data
        public class CustomBotResponse {
            private List<Choice> choices;
        }

        4.4 Create the Choice POJO class

        Java
        package com.chatgpt.bootchatgpt.beans;
        
        import lombok.Data;
        
        @Data
        public class Choice {
            private String index;
            private Message message;
        }

        4.5 Create the RestTemplateConfiguration class

        Java
        package com.chatgpt.bootchatgpt.configs;
        
        import org.springframework.beans.factory.annotation.Value;
        import org.springframework.context.annotation.Bean;
        import org.springframework.context.annotation.Configuration;
        import org.springframework.web.client.RestTemplate;
        
        @Configuration
        public class RestTemplateConfiguration {
        
            @Value("${openai.api.key}")
            private String openApiKey;
            
            @Bean
            public RestTemplate restTemplate(){
                RestTemplate restTemplate = new RestTemplate();
                restTemplate.getInterceptors().add((request, body, execution) -> {
                    request.getHeaders().add("Authorization", "Bearer "+openApiKey);
                    return execution.execute(request,body);
                });
                return restTemplate;
            }
        }

        4.6 Create the CustomBotController class

        Java
        package com.chatgpt.bootchatgpt.controller;
        
        import com.chatgpt.bootchatgpt.beans.CustomBotRequest;
        import com.chatgpt.bootchatgpt.beans.CustomBotResponse;
        import com.chatgpt.bootchatgpt.beans.Message;
        import org.springframework.beans.factory.annotation.Autowired;
        import org.springframework.beans.factory.annotation.Value;
        import org.springframework.http.HttpStatus;
        import org.springframework.http.ResponseEntity;
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.RequestMapping;
        import org.springframework.web.bind.annotation.RequestParam;
        import org.springframework.web.bind.annotation.RestController;
        import org.springframework.web.client.RestTemplate;
        
        import java.util.Collections;
        
        @RestController
        @RequestMapping("/api")
        public class CustomBotController {
        
            @Value("${openai.model}")
            private String model;
            
            @Value("${openai.api.url}")
            private String url;
        
            @Autowired
            private RestTemplate restTemplate;
        
            @GetMapping("/chat")
            public ResponseEntity<String> getResponse(@RequestParam("query") String query) {
                Message message = new Message("user", query);
                CustomBotRequest customBotRequest = new CustomBotRequest(model, Collections.singletonList(message));
                CustomBotResponse customBotResponse = restTemplate.postForObject(url, customBotRequest, CustomBotResponse.class);
                
                if(customBotResponse == null || customBotResponse.getChoices() == null || customBotResponse.getChoices().isEmpty()){
                    return ResponseEntity.status(HttpStatus.NO_CONTENT).body("No response from ChatGPT");
                }
                
                String botResponse = customBotResponse.getChoices().get(0).getMessage().getContent();
                return ResponseEntity.ok(botResponse);
            }
        }

        4.7 Add the following properties. Include your secret key generated from OpenAI API in step 3

        Properties
        openai.model=gpt-3.5-turbo
        openai.api.key=sk56RkTP5gF9a4L9bcBya34477W2dgdf7cvsdf6d0s9dfgkk
        openai.api.url=https://api.openai.com/v1/chat/completions

        5. Run The Application

        Access this endpoint via Postman or your browser: http://localhost:8080/api/chat?query=Java 8 features list. The provided query is “Java 8 features list,” but feel free to modify it as needed. The endpoint returns ChatGPT’s answer as the response body.

        Conclusion

        In summary, this guide has shown you how to bring ChatGPT and Spring Boot together, opening up exciting possibilities for your web applications. By following these steps, you can seamlessly integrate ChatGPT into your Spring Boot projects, enhancing user interactions and making your applications smarter. So, why wait? Dive in and discover the power of ChatGPT integration with Spring Boot today!

        Spring Boot MongoDB CRUD Application Example

        Spring Boot MongoDB CRUD Application Example: A Step-by-Step Guide

        Step 1: Setting Up the Project

        Start by creating a new Spring Boot project using Spring Initializr. Add the spring-boot-starter-data-mongodb dependency to enable MongoDB integration, laying the groundwork for our focused exploration of a Spring Boot MongoDB CRUD application example.

        XML
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-mongodb</artifactId>
        </dependency>
        

        Step 2: Define the Employee Model

        Create an Employee class to represent the data model. Annotate it with @Document to map it to a MongoDB collection.

        Java
        package com.crud.beans;
        
        import org.springframework.data.annotation.Id;
        import org.springframework.data.mongodb.core.mapping.Document;
        
        @Document(collection = "employees")
        public class Employee {
        
            @Id
            private String id;
            private String name;
            private int age;
        
            // Getters and setters
        }
        

        Step 3: Implement the Repository

        Create an EmployeeRepository interface that extends MongoRepository. This interface provides CRUD operations for the Employee entity.

        Java
        package com.crud.repo;
        
        import org.springframework.data.mongodb.repository.MongoRepository;
        import com.crud.beans.Employee;
        
        public interface EmployeeRepository extends MongoRepository<Employee, String> {
        
        }
        

        Step 4: Develop the Service Layer

        Create an EmployeeService class to encapsulate the business logic. Autowire the EmployeeRepository and implement methods for CRUD operations.

        Java
        package com.crud.service;
        
        import java.util.List;
        import java.util.Optional;
        import org.springframework.beans.factory.annotation.Autowired;
        import org.springframework.stereotype.Service;
        import com.crud.beans.Employee;
        import com.crud.repo.EmployeeRepository;
        
        @Service
        public class EmployeeService {
        
            @Autowired
            private EmployeeRepository employeeRepository;
            
            public List<Employee> getEmployees() {
                return employeeRepository.findAll();
            }
            
            public Employee create(Employee employee) {
                return employeeRepository.save(employee);
            }
            
            public Optional<Employee> updateEmployee(String id, Employee employee) {
                if(!employeeRepository.existsById(id)) {
                    return Optional.empty();
                }
                
                employee.setId(id);
                return Optional.of(employeeRepository.save(employee));
            }
            
            public void deleteEmployee(String id) {
                employeeRepository.deleteById(id);
            }
        }
        

        Step 5: Implement the Controller

        Create an EmployeeController class to define the RESTful API endpoints for CRUD operations.

        Java
        package com.crud.controller;
        
        import java.util.List;
        import java.util.Optional;
        import org.springframework.beans.factory.annotation.Autowired;
        import org.springframework.http.HttpStatus;
        import org.springframework.http.ResponseEntity;
        import org.springframework.web.bind.annotation.*;
        import com.crud.beans.DeleteResponse;
        import com.crud.beans.Employee;
        import com.crud.service.EmployeeService;
        
        @RestController
        @RequestMapping("/api/employees")
        public class EmployeeController {
        
            @Autowired
            private EmployeeService employeeService;
            
            @GetMapping
            public List<Employee> getEmployees() {
                return employeeService.getEmployees();
            }
            
            @PostMapping
            public ResponseEntity<Employee> createEmployee(@RequestBody Employee employee) {
                Employee createdEmployee = employeeService.create(employee);
                return ResponseEntity.status(HttpStatus.CREATED).body(createdEmployee);
            }
            
            @PutMapping("/{id}")
            public ResponseEntity<Employee> updateEmployee(@PathVariable String id, @RequestBody Employee employee) {
                return employeeService.updateEmployee(id, employee)
                        .map(ResponseEntity::ok)
                        .orElse(ResponseEntity.notFound().build());
            }
            
            @DeleteMapping("/{id}")
            public ResponseEntity<DeleteResponse> deleteEmployee(@PathVariable String id) {
                employeeService.deleteEmployee(id);
                return ResponseEntity.status(HttpStatus.OK)
                        .body(new DeleteResponse("Employee Deleted Successfully", id, "Deleted Employee Name"));
            }
        }
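
        The controller above references a DeleteResponse bean that isn't shown in these steps; a minimal sketch, with field names assumed from the constructor call, could look like this:

        Java
        package com.crud.beans;

        public class DeleteResponse {

            private String message;
            private String id;
            private String employeeName;

            public DeleteResponse(String message, String id, String employeeName) {
                this.message = message;
                this.id = id;
                this.employeeName = employeeName;
            }

            // Getters and setters
        }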
        

        Step 6: Configure MongoDB Connection

        Set up the MongoDB connection properties in the application.properties file.

        Properties
        spring.data.mongodb.host=localhost
        spring.data.mongodb.port=27017
        spring.data.mongodb.database=boot-crud

        Step 7: Test the Application

        Start the MongoDB server and run your Spring Boot application. Utilize tools like Postman or curl to send HTTP requests to the defined endpoints. Verify that the CRUD operations are functioning as expected by checking if you can retrieve, create, update, and delete employees.

        Creating a New Employee (POST)

        Endpoint: POST /api/employees

        Spring Boot MongoDB CRUD Application Example

        Retrieving All Employees (GET)

        Spring Boot MongoDB CRUD Application Example step by step guide

        Updating an Employee (PUT)

        Endpoint:PUT /api/employees/{id}

        Spring Boot MongoDB CRUD Application

        Deleting an Employee (DELETE)

        Endpoint:DELETE /api/employees/{id}

        Spring Boot Mongo DB CRUD Application  Example

        Below is the snapshot of the employee collection. Please review:

        Spring Boot Mongo DB CRUD Application Example mongo collection.

        Conclusion:

        By following these steps and using Postman to interact with the Spring Boot and MongoDB applications, you can easily perform CRUD operations on employee records. This example demonstrates how to retrieve, create, update, and delete employee data in a straightforward manner.

        Feel free to customize the input data and explore additional features of the application to suit your requirements. Happy coding!

        Top 20 Microservices Interview Questions and Answers

        Getting ready for a job interview that’s all about microservices? Well, you’re in the right place. We’ve gathered the top 20 microservices interview questions and paired them with detailed answers to help you shine in that interview room. Whether you’re a seasoned pro in the world of microservices or just starting out, these questions and answers are here to boost your confidence and knowledge. Let’s dive in and get you all set to impress your potential employers with your microservices expertise.

        Top 20 Microservices Interview Questions

        Q1) What are Microservices?

        Microservices, also known as Microservices Architecture, is a software development approach that involves constructing complex applications by assembling smaller, independent functional modules. Think of it as building a large, intricate system from smaller, self-contained building blocks.

        For instance, imagine a modern e-commerce platform. Instead of creating one monolithic application to handle everything from product listings to payments, you can use microservices. Each function, like product catalog, shopping cart, user authentication, and payment processing, becomes a separate microservice. They work together as a cohesive unit, with each microservice responsible for its specific task.

        This approach offers benefits such as flexibility, scalability, and ease of maintenance. If one microservice needs an update or experiences issues, it can be modified or fixed without affecting the entire system. It’s like having a toolkit of specialized tools that can be swapped in or out as needed, making software development more efficient and adaptable.

        Q2) What are the main features of Microservices?

        Decoupling: Modules are independent and do not rely on each other.

        Componentization: Applications are divided into small, manageable components.

        Business Capabilities: Modules correspond to specific business functions.

        Autonomy: Each module can function independently.

        Continuous Delivery(CI/CD): Frequent updates and releases are possible.

        Responsibility: Each module is responsible for its functionality.

        Decentralized Governance: Decision-making is distributed across modules.

        Agility: Adaptability and responsiveness to changes are key attributes.

        Q3) What are the key parts of Microservices?

        Microservices rely on various elements to work effectively. Some of the main components include:

        Containers, Clustering, and Orchestration: These tools help manage and organize microservices within a software environment.

        Infrastructure as Code (IaC): IaC involves using code to automate and control infrastructure setup and configuration.

        Cloud Infrastructure: Many microservices are hosted on cloud platforms, which provide the necessary computing resources.

        API Gateway: An API Gateway acts as a central entry point for various microservices, making it easier for them to communicate with each other.

        Enterprise Service Bus: This component facilitates efficient communication and integration between different microservices and applications.

        Service Delivery: Ensuring that microservices are delivered effectively to end-users and seamlessly integrated into the software system.

        These components work together to support the operation of microservices and enhance the scalability and flexibility of a software system.

        Q4) Explain how microservices work.

        Microservices Architecture:

        Top 20 Microservices Interview Questions and Answers

        Client Request: The process begins when a client, such as a web browser or mobile app, sends a request to the application. This request could be anything from fetching data to performing specific tasks.

        API Gateway: The client’s request is initially intercepted by the API Gateway, acting as the application’s point of entry. Think of it as the first stop for incoming requests.

        Service Discovery (Eureka Server): To find the right microservice to fulfill the request, the API Gateway checks in with the Eureka Server. This server plays a crucial role by maintaining a directory of where different microservices are located.

        Routing: With information from the Eureka Server in hand, the API Gateway directs the request to the specific microservice that’s best suited to handle it. This ensures that each request goes to the right place.

        Circuit Breaker: Inside the microservice, a Circuit Breaker is at work, keeping an eye on the request and the microservice’s performance. If the microservice faces issues or becomes unresponsive, the Circuit Breaker can temporarily halt additional requests to prevent further problems.

        Microservice Handling: The designated microservice takes the reins, processing the client’s request, and interacting with databases or other services as needed.

        Response Generation: After processing the request, the microservice generates a response. This response might include requested data, an acknowledgment, or the results of the task requested by the client.

        Ribbon Load Balancing: On the client’s side, Ribbon comes into play. It’s responsible for balancing the load when multiple instances of the microservice are available. Ribbon ensures that the client connects to the most responsive instance, enhancing performance and providing redundancy.

        API Gateway Response: The response generated by the microservice is sent back to the API Gateway.

        Client Response: Finally, the API Gateway returns the response to the client. The client then receives and displays this response. It could be the requested information or the outcome of a task, allowing the user to interact with the application seamlessly.

        Q5) What are the differences between Monolithic, SOA and Microservices Architecture?

        • Monolithic Architecture: A massive container where all software components are tightly bundled, creating one large system with a single code base.
        • Service-Oriented Architecture (SOA): A group of services that interact and communicate with each other. Communication can range from simple data exchange to multiple services coordinating activities.
        • Microservices Architecture: An application structured as a cluster of small, autonomous services focused on specific business domains. These services can be deployed independently, are scalable, and communicate using standard protocols.

        Q6) What is Service Orchestration and Service Choreography in Microservices?

        Service orchestration and service choreography are two different approaches for managing the dance of microservices. Here’s how they groove:

        • Service Orchestration: This is like having a conductor in an orchestra. There’s a central component that’s the boss, controlling and coordinating the movements of all microservices. It’s a tightly organized performance with everything in sync.
        • Service Choreography: Think of this as a group of dancers who know the steps and dance together without a choreographer. In service choreography, microservices collaborate directly with each other, no central controller in sight. It’s a bit more like a jam session, where each service has its own rhythm.
        • Comparison: Service orchestration offers a more controlled and well-coordinated dance, where every step is planned. Service choreography, on the other hand, is like a dance-off where individual services have the freedom to show their moves. It’s more flexible, but it can get a bit wild.

        Q7) What is the role of an actuator in Spring Boot?

        In Spring Boot, an actuator is a project that offers RESTful web services to access the real-time status and information about an application running in a production environment. It allows you to monitor and manage the usage of the application without the need for extensive coding or manual configuration. Actuators provide valuable insights into the application’s health, metrics, and various operational aspects, making it easier to maintain and troubleshoot applications in a production environment.
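
        For example, assuming spring-boot-starter-actuator is on the classpath, a custom health check can be contributed by implementing HealthIndicator; the bean below is an illustrative sketch, and its result appears under the /actuator/health endpoint:

        Java
        package com.example.health;  // illustrative package

        import org.springframework.boot.actuate.health.Health;
        import org.springframework.boot.actuate.health.HealthIndicator;
        import org.springframework.stereotype.Component;

        @Component
        public class ExternalServiceHealthIndicator implements HealthIndicator {

            @Override
            public Health health() {
                boolean reachable = pingExternalService();  // assumed check, replace with a real probe
                return reachable
                        ? Health.up().withDetail("externalService", "reachable").build()
                        : Health.down().withDetail("externalService", "unreachable").build();
            }

            private boolean pingExternalService() {
                // Placeholder for an actual connectivity check
                return true;
            }
        }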

        Q8) How to Customize Default Properties in Spring Boot Projects?

        Customizing default properties in a Spring Boot project, including database properties, is achieved by specifying these settings in the application.properties file. Here’s an example that explains this concept without plagiarism:

        Example: Database Configuration

        Imagine you have a Spring Boot application that connects to a database. To tailor the database connection to your needs, you can define the following properties in the application.properties file:

        Bash
        spring.datasource.url = jdbc:mysql://localhost:3306/db-name
        spring.datasource.username = user-name
        spring.datasource.password = password

        By setting these properties in the application.properties file, you can easily adjust the database configuration of your Spring Boot application. This flexibility allows you to adapt your project to different database environments or specific requirements without the need for extensive code modifications

        Q9) What is Cohesion and Coupling in Software Design?

        Cohesion refers to the relationship between the parts or elements within a module. It measures how well these elements work together to serve a common purpose. When a module exhibits high cohesion, its elements collaborate efficiently to perform a specific function, and they do so without requiring constant communication with other modules. In essence, high cohesion signifies that a module is finely tuned for a specific task, which, in turn, enhances the overall functionality of that module.

        For example, consider a module in a word-processing application that handles text formatting. It exhibits high cohesion by focusing solely on tasks like font styling, paragraph alignment, and spacing adjustments without being entangled in unrelated tasks.

        Coupling signifies the relationship between different software modules, like Modules A and B. It assesses how much one module relies on or interacts with another. Coupling can be categorized into three main types: highly coupled (high dependency), loosely coupled, and uncoupled. The most favorable form of coupling is loose coupling, which is often achieved through well-defined interfaces. In a loosely coupled system, modules maintain a degree of independence and can be modified or replaced with minimal disruption to other modules.

        For instance, think of an e-commerce application where the product catalog module and the shopping cart module are loosely coupled. They communicate through a clear interface, allowing each to function independently. This facilitates future changes or upgrades to either module without causing significant disturbances in the overall system.
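
        To make the idea concrete, here is a minimal Java sketch of loose coupling through an interface; the class names are hypothetical and only illustrate the shape of the design:

        Java
        // The shopping cart depends only on this interface, not on a concrete catalog.
        interface ProductCatalog {
            double priceOf(String productId);
        }

        // One possible implementation; it can be replaced without touching the cart.
        class DatabaseProductCatalog implements ProductCatalog {
            @Override
            public double priceOf(String productId) {
                return 99.99; // stub value for illustration
            }
        }

        class ShoppingCart {
            private final ProductCatalog catalog;

            ShoppingCart(ProductCatalog catalog) {
                this.catalog = catalog;
            }

            double totalFor(String productId, int quantity) {
                return catalog.priceOf(productId) * quantity;
            }
        }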

        In summary, cohesion and coupling are fundamental principles in software design that influence how modules are organized and interact within a software system. High cohesion and loose coupling are typically sought after because they lead to more efficient, maintainable, and adaptable software systems.

        Q10) What Defines Microservice Design?

        Microservice design is guided by a set of core principles that distinguish it from traditional monolithic architectures:

        • Business-Centric Approach: Microservices are organized around specific business capabilities or functions. Each microservice is responsible for a well-defined task, ensuring alignment with the organization’s core business objectives.
        • Product-Oriented Perspective: Unlike traditional projects, microservices are treated as ongoing products. They undergo continuous development, maintenance, and improvement to remain adaptable to evolving business needs.
        • Effective Messaging Frameworks: Microservices rely on robust messaging frameworks to facilitate seamless communication. These frameworks enable microservices to exchange data and coordinate tasks efficiently.
        • Decentralized Governance: Microservices advocate decentralized governance, granting autonomy to each microservice team. This decentralization accelerates development and decision-making processes.
        • Distributed Data Management: Data management in microservices is typically decentralized, with each microservice managing its data store. This approach fosters data isolation, scalability, and independence.
        • Automation-Driven Infrastructure: Automation plays a pivotal role in microservices. Infrastructure provisioning, scaling, and maintenance are automated, reducing manual effort and minimizing downtime.
        • Resilience as a Design Principle: Microservices are designed with the expectation of failures. Consequently, they prioritize resilience. When one microservice encounters issues, it should not disrupt the entire system, ensuring uninterrupted service availability.

        These principles collectively contribute to the agility, scalability, and fault tolerance that make microservices a popular choice in modern software development. They reflect a strategic shift towards building software systems that are more responsive to the dynamic demands of today’s businesses.

        Q11) What’s the Purpose of Spring Cloud Config and How Does It Work?

        Let’s simplify this for a clear understanding:

        Purpose: Spring Cloud Config is like the command center for configuration properties in microservices. Its main job is to make sure all the configurations are well-organized, consistent, and easy to access.

        How It Works:

        • Version-Controlled Repository: All your configuration info is stored in a special place that keeps a history of changes. Think of it as a well-organized filing cabinet for configurations.
        • Configuration Server: Inside Spring Cloud Config, there’s a designated server that takes care of your configuration data. It’s like the trustworthy guard of your valuable information.
        • Dynamic and Centralized: The cool part is that microservices can request their configuration details from this server on the spot, while they’re running. This means any changes or updates to the configurations are instantly shared with all the microservices. It’s like having a super-efficient communication channel for all your configurations.
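
        A minimal sketch of such a configuration server, assuming the spring-cloud-config-server dependency is on the classpath (the Git URL is only a placeholder):

        Java
        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.cloud.config.server.EnableConfigServer;

        // Serves configuration from a version-controlled repository to the other microservices.
        @EnableConfigServer
        @SpringBootApplication
        public class ConfigServerApplication {
            public static void main(String[] args) {
                SpringApplication.run(ConfigServerApplication.class, args);
            }
        }

        // In application.properties, point the server at the repository, for example:
        // spring.cloud.config.server.git.uri=https://github.com/your-org/config-repo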

        Q12) How Do Independent Microservices Communicate?

        Picture a world of microservices, each minding its own business. Yet, they need to talk to each other, and they do it quite ingeniously:

        • HTTP/REST with JSON or Binary Protocols: It’s like sending letters or emails. Microservices make requests to others, and they respond. They speak a common language, often in formats like JSON or more compact binary codes. This works well when one service needs specific information or tasks from another.
        • Websockets for Streaming: For those real-time conversations, microservices use Websockets. Think of it as talking on the phone, but not just in words – they can share data continuously. It’s ideal for things like live chats, streaming updates, or interactive applications.
        • Message Brokers: These are like message relay stations. Services send messages to a central point (the broker), and it ensures messages get to the right recipients. There are different types of brokers, each specialized for specific communication scenarios. Apache Kafka, for instance, is like the express courier for high-throughput data.
        • Backend as a Service (BaaS): This is the “hands-free” option. Microservices can use platforms like Space Cloud, which handle a bunch of behind-the-scenes tasks. It’s like hiring someone to take care of your chores. BaaS platforms can manage databases, handle authentication, and even run serverless functions.

        In this interconnected world, microservices pick the best way to chat based on what they need to say. It’s all about keeping them independent yet harmoniously communicating in the vast landscape of microservices.
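
        As a small illustration of the first option, one microservice can call another synchronously over HTTP/REST with Spring’s RestTemplate; the URL below is purely illustrative:

        Java
        import org.springframework.web.client.RestTemplate;

        public class InventoryClient {

            private final RestTemplate restTemplate = new RestTemplate();

            // Fetches product details from a (hypothetical) inventory service over HTTP.
            public String fetchProduct(long productId) {
                return restTemplate.getForObject(
                        "http://localhost:8081/products/{id}", String.class, productId);
            }
        }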

        Q13) What is Domain-Driven Design (DDD)?

        Domain-Driven Design, often abbreviated as DDD, is an approach to software development that centers on a few key principles:

        • Focus on the Core Domain and Domain Logic: DDD places a strong emphasis on understanding and honing in on the most critical and valuable aspects of a project, which is often referred to as the “core domain.” This is where the primary business or problem-solving logic resides. DDD aims to ensure that the software accurately represents and serves this core domain.
        • Analyze Domain Models for Complex Designs: DDD involves in-depth analysis of the domain models. By doing so, it seeks to uncover intricate designs and structures within the domain that may not be immediately apparent. This analysis helps in creating a software design that faithfully mirrors the complexity and nuances of the real-world domain.
        • Continuous Collaboration with Domain Experts: DDD encourages regular and close collaboration between software development teams and domain experts. These domain experts are individuals who possess in-depth knowledge of the problem domain (the industry or field in which the software will be used). By working together, they refine the application model, ensuring it effectively addresses emerging issues and aligns with the evolving domain requirements.

        In essence, Domain-Driven Design is a holistic approach that promotes a deep understanding of the problem domain, leading to software solutions that are more accurate, relevant, and adaptable to the ever-changing needs of the domain they serve.

        Q14) What is OAuth?

        Think of OAuth as the key to the world of one-click logins. It’s what allows you to use your Facebook or Google account to access various websites and apps without creating new usernames and passwords.

        Here’s the magic:

        • No More New Accounts: Imagine you stumble upon a cool new app, and it asks you to sign up. With OAuth, you can skip that part. Instead, you click “Log in with Facebook” or another platform you trust.
        • Sharing Just What’s Needed: You don’t have to share your Facebook password with the app. Instead, the app asks Facebook, “Is this person who they claim to be?” Facebook says, “Yep, it’s them!” and you’re in.
        • Secure and Convenient: OAuth makes logging in more secure because you’re not giving out your password to every app you use. It’s like showing your ID card to get into a party without revealing all your personal info.

        So, next time you see the option to log in with Google or some other platform, you’ll know that OAuth is working behind the scenes to make your life simpler and safer on the internet.

        Q15) Why Do Reports and Dashboards Matter in Microservices?

        Reports and dashboards play a pivotal role in the world of microservices for several key reasons:

        • Resource Roadmap: Imagine reports and dashboards as your detailed map of the microservices landscape. They show you which microservices handle specific tasks and resources. It’s like having a GPS for your system’s functionality.
        • Change Confidence: When changes happen (and they do in software), reports and dashboards step in as your security net. They tell you exactly which services might be impacted. Think of it as a warning system that prevents surprises.
        • Instant Documentation: Forget digging through files or searching for the latest documents. Reports and dashboards are your instant, always-up-to-date documentation. Need info on a specific service? It’s just a click away.
        • Version Control: In the microservices world, keeping tabs on different component versions is a bit like tracking your app updates. Reports and dashboards help you stay on top of what’s running where and if any part needs an upgrade.
        • Quality Check: They’re your quality control inspectors. They help you assess how mature and compliant your services are. It’s like checking the quality of ingredients before cooking a meal – you want everything to be up to the mark.

        So, reports and dashboards are your trustworthy companions, helping you navigate the intricacies of microservices, ensuring you’re in control and making informed decisions in this dynamic software world.

        Q16) What are Reactive Extensions in Microservices?

        Reactive Extensions, or Rx, is a design approach within microservices that coordinates multiple service calls and combines their results into a single response. These calls can be blocking or non-blocking, synchronous or asynchronous. In the context of distributed systems, Rx operates in a manner distinct from traditional workflows.
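
        To illustrate the underlying idea, combining several asynchronous calls into a single response, without relying on the RxJava library itself, here is a plain-Java sketch using CompletableFuture with simulated service calls:

        Java
        import java.util.concurrent.CompletableFuture;

        public class CombinedResponseExample {
            public static void main(String[] args) {
                // Simulated asynchronous calls to two different services.
                CompletableFuture<String> profileCall =
                        CompletableFuture.supplyAsync(() -> "profile-data");
                CompletableFuture<String> ordersCall =
                        CompletableFuture.supplyAsync(() -> "orders-data");

                // Combine both results into one response.
                CompletableFuture<String> response =
                        profileCall.thenCombine(ordersCall, (p, o) -> p + " | " + o);

                System.out.println(response.join()); // profile-data | orders-data
            }
        }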

        Q17) Types of Tests Commonly Used in Microservices?

        Testing in the world of microservices can be quite intricate due to the interplay of multiple services. To manage this complexity, tests are categorized based on their level of focus:

        • Unit Tests: These tests zoom in on the smallest building blocks of microservices – individual functions or methods. They validate that each function performs as expected in isolation.
        • Component Tests: At this level, multiple functions or components within a single microservice are tested together. Component tests ensure that the internal workings of a microservice function harmoniously.
        • Integration Tests: Integration tests go further by examining how different microservices collaborate. They validate that when multiple microservices interact, the system behaves as anticipated.
        • Contract Tests: These tests check the agreements or contracts between microservices. They ensure that the communication between services adheres to predefined standards, preventing unintended disruptions.
        • End-to-End (E2E) Tests: E2E tests assess the entire application’s functionality, simulating user journeys. They validate that all microservices work cohesively to provide the desired user experience.
        • Load and Performance Tests: These tests evaluate how microservices perform under varying loads. They help identify bottlenecks and performance issues to ensure the system can handle real-world demands.
        • Security Tests: Security tests scrutinize the microservices for vulnerabilities and ensure data protection measures are effective.
        • Usability Tests: Usability tests assess the user-friendliness and accessibility of the microservices. They focus on the overall user experience.
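
        As a concrete example of the first category, a unit test exercises a single method in isolation. The PriceCalculator class below is hypothetical and only shows the typical shape of such a JUnit 5 test:

        Java
        import static org.junit.jupiter.api.Assertions.assertEquals;

        import org.junit.jupiter.api.Test;

        class PriceCalculatorTest {

            // Hypothetical class under test.
            static class PriceCalculator {
                double withVat(double netPrice, double vatRate) {
                    return netPrice * (1 + vatRate);
                }
            }

            @Test
            void addsVatToNetPrice() {
                PriceCalculator calculator = new PriceCalculator();
                assertEquals(120.0, calculator.withVat(100.0, 0.20), 0.0001);
            }
        }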

        Q18) What are Containers in Microservices?

        Containers are a powerful solution for managing microservices. They excel in efficiently allocating and sharing resources, making them the preferred choice for developing and deploying microservice-based applications. Here’s the essence of containers in the world of microservices:

        • Resource Allocation: Containers excel in efficiently distributing computing resources. They ensure each microservice has the right amount of CPU, memory, and storage to function optimally.
        • Isolation: Containers create a secure boundary for each microservice. They operate independently, preventing conflicts or interference between services, which is crucial in microservices architecture.
        • Portability: Containers package microservices and their dependencies into a single, portable unit. This means you can develop a microservice on your local machine and deploy it in various environments, ensuring consistency.
        • Efficient Scaling: Containers make scaling microservices a breeze. You can replicate and deploy containers as needed, responding quickly to changing workloads.
        • Simplified Management: Container orchestration platforms like Kubernetes provide centralized management for deploying, scaling, and monitoring microservices in a containerized environment.

        Q19) The Core Role of Docker in Microservices?

        • Containerizing Applications: Docker acts as a container environment where you can place your microservices. It not only packages the microservice itself but also all the necessary components it relies on to function seamlessly. These bundled packages are aptly called “Docker containers.”
        • Streamlined Management: With Docker containers, managing microservices becomes straightforward. You can effortlessly start, stop, or move them around, akin to organizing neatly labeled boxes for easy transport.
        • Resource Efficiency: Docker ensures that each microservice receives the appropriate amount of computing resources, like CPU and memory. This ensures that they operate efficiently without monopolizing or underutilizing system resources.
        • Consistency: Docker fosters uniformity across different stages, such as development, testing, and production. No longer will you hear the excuse, “It worked on my machine.” Docker guarantees consistency, a valuable asset in the world of microservices.
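
        As a rough sketch of what this looks like in practice (the image and service names below are made up), a microservice packaged as a jar can be built into an image and run as an isolated container:

        Bash
        # Build an image for a (hypothetical) order service from a Dockerfile in the current directory
        docker build -t order-service:1.0 .

        # Run it as a container, mapping the service port to the host
        docker run -d -p 8082:8082 --name order-service order-service:1.0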

        Q20) What tools are used to aggregate microservices log files?

        In the world of microservices, managing log files can be a bit of a juggling act. To simplify this essential task, here are some reliable tools at your disposal:

        • ELK Stack (Elasticsearch, Logstash, Kibana): The ELK Stack is like a well-coordinated trio of tools designed to handle your log data.
          • Logstash: Think of Logstash as your personal data curator. It’s responsible for collecting and organizing log information.
          • Elasticsearch: Elasticsearch acts as your dedicated log archive. It meticulously organizes and stores all your log entries.
          • Kibana: Kibana takes on the role of your trusted detective, armed with a magnifying glass. It allows you to visualize and thoroughly inspect your logs. Whether you’re searching for trends, anomalies, or patterns, Kibana has got you covered.
        • Splunk: Splunk is the heavyweight champion in the world of log management.
          • This commercial tool comes packed with a wide range of features. It not only excels at log aggregation but also offers powerful searching, monitoring, and analysis capabilities.
          • It provides real-time alerts, dynamic dashboards, and even harnesses the might of machine learning for in-depth log data analysis.

        Spring Boot Apache Kafka Tutorial: Practical Example


        Introduction:

        When we need to reuse the logic of one application in another application, we often turn to web services or RESTful services. However, if we want to asynchronously share data from one application to another, message queues, and in particular, Spring Boot Apache Kafka, come to the rescue.

        Spring Boot Apache Kafka

        Message queues operate on a publish-subscribe (pub-sub) model, where one application acts as a publisher (sending data to the message queue), and another acts as a subscriber (receiving data from the message queue). Several message queue options are available, including JMS, IBM MQ, RabbitMQ, and Apache Kafka.

        Apache Kafka is an open-source distributed streaming platform designed to handle such scenarios.

        Kafka Cluster

        As Kafka is a distributed system, it functions as a cluster consisting of multiple brokers. A Kafka cluster should have a minimum of three brokers. The diagram below illustrates a Kafka cluster with three brokers:

        Apache Kafka Architecture

        Spring Boot Kafka Architecture

        Kafka Broker

        A Kafka broker is essentially a Kafka server. It serves as an intermediary, facilitating communication between producers (data senders) and consumers (data receivers). The following diagram depicts a Kafka broker in action:

        Kafka Broker Architecture


        Main APIs in Spring Boot Apache Kafka

        1. Producer API: Responsible for publishing data to the message queue.
        2. Consumer API: Deals with consuming messages from the Kafka queue.
        3. Streams API: Manages continuous streams of data.
        4. Connect API: Integrates Kafka with external systems such as databases, using reusable source and sink connectors.
        5. Admin API: Manages Kafka topics, brokers, and related configurations.

        Steps:

        Step 1: Download and Extract Kafka

        Begin by downloading Kafka from the official Apache Kafka downloads page and extracting it to your desired location.

        Step 2: Start the ZooKeeper Server

        The ZooKeeper server provides the environment for running the Kafka server. Depending on your operating system:

        For Windows, open a command prompt, navigate to the Kafka folder, and run:

        Bash
        bin\windows\zookeeper-server-start.bat config\zookeeper.properties

        For Linux/Mac, use the following command:

        Bash
        bin/zookeeper-server-start.sh config/zookeeper.properties

        ZooKeeper runs on port 2181.

        Step 3: Start the Kafka Server

        After starting ZooKeeper, run the Kafka server with the following command for Windows:

        Bash
        bin\windows\kafka-server-start.bat config\server.properties

        For Linux/Mac, use the following command:

        Bash
        bin/kafka-server-start.sh config/server.properties

        Kafka runs on port 9092.

        Step 4: Create a Kafka Topic

        You can create a Kafka topic using two methods:

        4.1. Using Command Line:

        Open a command prompt or terminal and run the following command for Windows:

        Bash
        bin\windows\kafka-topics.bat --create --topic student-enrollments --bootstrap-server localhost:9092

        Replace “student-enrollments” with your desired topic name.

        For Linux/Mac:

        Bash
        bin/kafka-topics.sh --create --topic student-enrollments --bootstrap-server localhost:9092

        4.2. From the Spring Boot Application (Kafka Producer):

        For this, we’ll create a Kafka producer application that will programmatically create a topic.

        Step 5: Setting Up a Spring Boot Kafka Producer

        Step 5.1: Add Dependencies

        In your Spring Boot project, add the following dependencies to your pom.xml or equivalent configuration:

        XML
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-aop</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.retry</groupId>
            <artifactId>spring-retry</artifactId>
        </dependency>
        

        Step 5.2: Configure Kafka Producer Properties

        Add the following Kafka producer properties to your application.properties or application.yml:

        Properties
        # Producer Configurations
        spring.kafka.producer.bootstrap-servers=localhost:9092
        spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
        spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

        Step 5.3: Enable Retry

        Add the @EnableRetry annotation to your application class to enable event retrying:

        Java
        @EnableRetry
        @SpringBootApplication
        public class KafkaProducerApplication {
            public static void main(String[] args) {
                SpringApplication.run(KafkaProducerApplication.class, args);
            }
        }

        Step 5.4: Create Kafka Topics

        Configure Kafka topics in a KafkaConfig.java class:

        Java
        @Configuration
        public class KafkaConfig {
            public static final String FIRST_TOPIC = "student-enrollments";
            public static final String SECOND_TOPIC = "student-grades";
            public static final String THIRD_TOPIC = "student-achievements";
            
            @Bean
            List<NewTopic> topics() {
                List<String> topicNames = Arrays.asList(FIRST_TOPIC, SECOND_TOPIC, THIRD_TOPIC);
                return topicNames.stream()
                    .map(topicName -> TopicBuilder.name(topicName).build())
                    .collect(Collectors.toList());
            }
        }

        Step 5.5: Create a Producer Service:

        Implement a ProducerService.java to send messages:

        Java
        @Service
        public class ProducerService {
        
            @Autowired
            private KafkaTemplate<String, String> kafkaTemplate;
        
            @Retryable(maxAttempts = 3)
            public CompletableFuture<SendResult<String, String>> sendMessage(String topicName, String message) {
                return this.kafkaTemplate.send(topicName, message);
            }
        }
        

        Step 5.6: Create a Student Bean

        Define a Student class with appropriate getters, setters, and a constructor.

        Java
        public class Student {
        	private String name;
        	private String email;
        	
        	// getters, setters, and toString() omitted for brevity
        }

        Step 5.7: Create a Kafka Controller

        Create a controller to produce messages:

        Java
        @RestController
        public class KafkaController {
            @Autowired
            private ProducerService producerService;
            
            @PostMapping("/produce")
            public ResponseEntity<String> produce(@RequestParam String topicName, @RequestBody Student student)
                    throws InterruptedException, ExecutionException {
                String successMessage = null;
                producerService.sendMessage(topicName, "Producing Student Details: " + student);
                successMessage = String.format(
                        "Successfully produced student information to the '%s' topic. Please check the consumer.", topicName);
                return ResponseEntity.status(HttpStatus.OK).body(successMessage);
            }
        }
        

        Step 6: Spring Boot Consumer Application

        You can consume Kafka events/topics in two ways:

        Step 6.1: Using Command Line

        To consume messages using the command line for Windows, use the following command:

        Bash
        bin\windows\kafka-console-consumer.bat --topic student-enrollments --from-beginning --bootstrap-server localhost:9092
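
        For Linux/Mac, the equivalent command is:

        Bash
        bin/kafka-console-consumer.sh --topic student-enrollments --from-beginning --bootstrap-server localhost:9092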

        Step 6.2: Building a Consumer Application

        To build a consumer application, follow these steps:

        Step 6.2.1: Create a Spring Boot Project

        Create a Spring Boot project with an application class.

        Java
        @SpringBootApplication
        public class KafkaConsumerApplication {
            public static void main(String[] args) {
                SpringApplication.run(KafkaConsumerApplication.class, args);
            }
        }

        Step 6.2.2: Create a Kafka Consumer

        Implement a Kafka consumer class to consume messages:

        Java
        @Service
        public class KafkaConsumer {
            @KafkaListener(topics = {"student-enrollments", "student-grades", "student-achievements"}, groupId = "group-1")
            public void consume(String value) {
                System.out.println("Consumed: " + value);
            }
        }

        Step 6.2.3: Configure Kafka Consumer Properties

        Configure Kafka consumer properties in application.properties or application.yml:

        Properties
        server.port=8089
        spring.kafka.consumer.bootstrap-servers=localhost:9092
        spring.kafka.consumer.group-id=group-1
        spring.kafka.consumer.auto-offset-reset=earliest
        spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
        spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer

        Step 6.2.4: Run Your Kafka Consumer Application

        Make sure to follow each step carefully and don’t miss any instructions. This guide should help beginners set up and use Apache Kafka with Spring Boot effectively.

        Now that you’ve set up your Kafka producer and Kafka consumer applications, it’s time to run them.

        Execute both the Producer and Consumer applications. In the Producer application, send a POST request to the endpoint http://localhost:8080/produce?topicName=student-enrollments with the student details in the request body. You will then observe the corresponding output in the Consumer application, and also in the console if you are subscribed to the same “student-enrollments” topic.
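
        Since the /produce endpoint expects a POST with the student details in the request body, the call can be made, for example, with curl (the name and email below are placeholders):

        Bash
        curl -X POST "http://localhost:8080/produce?topicName=student-enrollments" \
             -H "Content-Type: application/json" \
             -d '{"name": "John", "email": "john@example.com"}'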

        Spring Boot Kafka Producer

        To monitor the topic from the console, use the following command:

        Bash
        bin\windows\kafka-console-consumer.bat --topic student-enrollments --from-beginning --bootstrap-server localhost:9092
        Spring Boot kafka Consumer Output

        You can follow the same process to produce messages for the remaining topics, “student-grades” and “student-achievements,” and then check the corresponding output.

        Conclusion

        To recap, when you need to asynchronously share data between applications, consider using Apache Kafka, a message queue system. Kafka functions in a cluster of brokers, and this guide is aimed at helping beginners set up Kafka with Spring Boot. After setup, run both producer and consumer applications to facilitate data exchange through Kafka.

        For more detailed information on the Kafka producer application, you can clone the repository from this link: Kafka Producer Application Repository.

        Similarly, for insights into the Kafka consumer application, you can clone the repository from this link: Kafka Consumer Application Repository.

        These repositories provide additional resources and code examples to help you better understand and implement Kafka integration with Spring Boot.

        Spring Boot API Gateway Tutorial


        1. Introduction to Spring Boot API Gateway

        In this tutorial, we’ll explore the concept of a Spring Boot API Gateway, which serves as a centralized entry point for managing multiple APIs in a microservices-based architecture. The API Gateway plays a crucial role in handling incoming requests, directing them to the appropriate microservices, and ensuring security and scalability. By the end of this tutorial, you’ll have a clear understanding of how to set up a Spring Boot API Gateway to streamline your API management.

        2. Why Use an API Gateway?

        In a microservices-based architecture, your project typically involves numerous APIs. The API Gateway simplifies the management of all these APIs within your application. It acts as the primary entry point for accessing any API provided by your application.

        Spring Boot API Gateway

        3. Setting Up the Spring Boot API Gateway

        To get started, you’ll need to create a Spring Boot application for your API Gateway. Here’s the main class for your API Gateway application:

        Java
        package com.javadzone.api.gateway;
        
        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
        
        @EnableDiscoveryClient
        @SpringBootApplication
        public class SpringApiGatewayApplication {
        	
        	public static void main(String[] args) {
        		SpringApplication.run(SpringApiGatewayApplication.class, args);
        	}
        	
        }
        

        In this class, we use the @SpringBootApplication annotation to mark it as a Spring Boot application. Additionally, we enable service discovery by using @EnableDiscoveryClient, which allows your API Gateway to discover other services registered in the service registry.

        3.1 Configuring Routes

        To configure routes for your API Gateway, you can use the following configuration in your application.yml or application.properties file:

        YAML
        server:
          port: 7777
          
        spring:
          application:
            name: api-gateway
          cloud:
            gateway:
              routes:
                - id: product-service-route
                  uri: http://localhost:8081
                  predicates:
                    - Path=/products/**
                - id: order-service-route  
                  uri: http://localhost:8082 
                  predicates:
                    - Path=/orders/**

        In this configuration:

        • We specify that our API Gateway will run on port 7777.
        • We give our API Gateway application the name “api-gateway” to identify it in the service registry.
        • We define two routes: one that forwards requests matching /products/** to the inventory (product) service on port 8081, and another that forwards /orders/** to the order service on port 8082. These routes determine how requests to specific paths are forwarded to the respective microservices.

        3.2 Spring Boot API Gateway Dependencies

        To build your API Gateway, make sure you include the necessary dependencies in your pom.xml file:

        XML
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-webflux</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-starter-bootstrap</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-starter-gateway</artifactId>
            </dependency>
        </dependencies>

        4. Running the Microservices

        To complete the setup and fully experience the functionality of the Spring Boot API Gateway, you should also run the following components:

        4.1. Clone the Repositories:

        Clone the repositories for the services by using the following GitHub links:

        If you’ve already created the API Gateway using the provided code above, there’s no need to clone it again; you can move forward with starting the services and testing the API Gateway as previously described. If you haven’t created the API Gateway yet, clone it from the Spring Boot API Gateway Repository.

        You can use Git to clone these repositories to your local machine. For example:

        Bash
        git clone https://github.com/askPavan/inventory-service.git
        git clone https://github.com/askPavan/order-service.git
        git clone https://github.com/askPavan/spring-api-gateway.git
        git clone https://javadzone.com/eureka-server/

        4.2. Build and Run the Services:

        For each of the services (Inventory Service, Order Service, Eureka Server) and the API Gateway, navigate to their respective project directories in your terminal.

        • Navigate to the “Services/apis” directory.
        • Build the application using Maven:
        Bash
        mvn clean install

        You can begin running the services by executing the following command:

        Bash
        java -jar app-name.jar

        Please replace “app-name” with the actual name of your API or service. Alternatively, if you prefer, you can also start the services directly from your integrated development environment (IDE).

        4.3. Start Eureka Server:

        You can run the Eureka Server using the following command:

        Bash
        java -jar eureka-server.jar

        Make sure that you’ve configured the Eureka Server according to your application properties, as mentioned earlier.

        When you access the Eureka server using the URL http://localhost:8761, you will be able to view the services that are registered in Eureka. Below is a snapshot of what you will see.

        Spring Boot API Gateway

        4.4. Test the API Gateway and Microservices:

        Once all the services are up and running, you can test the API Gateway by sending requests to it. The API Gateway should route these requests to the respective microservices (e.g., Inventory Service and Order Service) based on the defined routes.

        Get All Products:

        When you hit the endpoint http://localhost:7777/products using a GET request, you will receive a JSON response containing a list of products:

        JSON
        [
            {
                "id": 1,
                "name": "Iphone 15",
                "price": 150000.55
            },
            {
                "id": 2,
                "name": "Samsung Ultra",
                "price": 16000.56
            },
            {
                "id": 3,
                "name": "Oneplus",
                "price": 6000.99
            },
            {
                "id": 4,
                "name": "Oppo Reno",
                "price": 200000.99
            },
            {
                "id": 5,
                "name": "Oneplus 10R",
                "price": 55000.99
            }
        ]
        

        Get a Product by ID:

        When you hit an endpoint like http://localhost:7777/products/{id} (replace {id} with a product number) using a GET request, you will receive a JSON response containing details of the specific product:

        JSON
        {
            "id": 2,
            "name": "Samsung Ultra",
            "price": 16000.56
        }
        

        Create a Product Order:

        You can create a product order by sending a POST request to http://localhost:7777/orders/create. Include the necessary data in the request body. For example:

        JSON
        {
            "productId": 1234,
            "userId": "B101",
            "quantity": 2,
            "price": 1000.6
        }
        

        You will receive a JSON response with the order details.

        JSON
        {
            "id": 1,
            "productId": 1234,
            "userId": "B101",
            "quantity": 2,
            "price": 1000.6
        }
        

        Fetch Orders:

        To fetch orders, send a GET request to http://localhost:8082/orders. You will receive a JSON response with order details similar to the one created earlier.

        JSON
        {
            "id": 1,
            "productId": 1234,
            "userId": "B101",
            "quantity": 2,
            "price": 1000.6
        }
        

        By following these steps and using the provided endpoints, you can interact with the services and API Gateway, allowing you to understand how they function in your microservices architecture.

        For more detailed information about the Spring Boot API Gateway, please refer to this repository: Spring Boot API Gateway Repository.

        FAQs

        Q1. What is an API Gateway? An API Gateway serves as a centralized entry point for efficiently managing and directing requests to microservices within a distributed architecture.

        Q2. How does load balancing work in an API Gateway? Load balancing within an API Gateway involves the even distribution of incoming requests among multiple microservices instances, ensuring optimal performance and reliability.

        Q3. Can I implement custom authentication methods in my API Gateway? Absolutely, you have the flexibility to implement custom authentication methods within your API Gateway to address specific security requirements.

        Q4. What is the role of error handling in an API Gateway? Error handling within an API Gateway plays a crucial role in ensuring that error responses are clear and informative. This simplifies the process of diagnosing and resolving issues as they arise.

        Q5. How can I monitor the performance of my API Gateway in a production environment? To monitor the performance of your API Gateway in a production setting, you can leverage monitoring tools and metrics designed to provide insights into its operational efficiency.

        Feel free to reach out if you encounter any issues or have any questions along the way. Happy coding!

        Singleton Design Pattern in Java: Handling All Cases


        The Singleton Design Pattern is a widely used, classic design pattern. When a class is designed as a singleton, it ensures that only one instance of that class can exist within an application. Typically, we employ this pattern when we need a single, global access point to that instance.

        1. How to create a singleton class


        To make a class a singleton, you should follow these steps:

        a) Declare the class constructor as private: By declaring the class constructor as private, you prevent other classes in the application from creating objects of the class directly. This ensures that only one instance is allowed.

        b) Create a static method: Since the constructor is private, external classes cannot directly call it to create objects. To overcome this, you can create a static method within the class. This method contains the logic for checking and returning a single object of the class. Since it’s a static method, it can be called without the need for an object. This method is often referred to as a factory method or static factory method.

        c) Declare a static member variable of the same class type: In the static method mentioned above, you need to keep track of whether an object of the class already exists. To achieve this, you initially create an object and store it in a member variable. In subsequent calls to the method, you return the same object stored in the member variable. However, member variables cannot be accessed directly in static methods, so you declare the member variable as a static variable to hold the reference to the class’s single instance.

        Here’s a sample piece of code to illustrate these concepts:

        The UML representation of the singleton pattern is as follows:

        Singleton Design Pattern in Java

        Important points to keep in mind:

        • The CacheManager() constructor is declared as private.
        • The class contains a static variable named instance.
        • The getInstance() method is static and serves as a factory method for creating instances of the class.
        Java
        public class CacheManager {
        
        	// Declare a static member of the same class type.
        	private static CacheManager instance;
        
        	// Private constructor to prevent other classes from creating objects.
        	private CacheManager() {
        	}
        
        	// Declare a static method to create only one instance.
        	public static CacheManager getInstance() {
        		if (instance == null) {
        			instance = new CacheManager();
        		}
        		return instance;
        	}
        }
        

        We can express the above code in various alternative ways, and there are numerous methods to enhance its implementation. Let’s explore some of those approaches in the sections below.

        1.1 Eager Initialization

        In the previous code, we instantiated the instance on the first call to the getInstance() method. Instead of deferring instantiation until the method is called, we can initialize it eagerly, at the time the class is loaded into memory, as demonstrated below:

        Java
        public class CacheManager {
        
        	// Instantiate the instance object during class loading.
        	private static CacheManager instance = new CacheManager();
        
        	private CacheManager() {
        	}
        
        	public static CacheManager getInstance() {
        		return instance;
        	}
        }
        

        1.2 Static Block Initialization

        If you are familiar with the concept of static blocks in Java, you can utilize this concept to instantiate the singleton class, as demonstrated below:

        Java
        public class CacheManager {
        
        	private static CacheManager instance;
        
        	// The static block executes only once when the class is loaded.
        	static {
        		instance = new CacheManager();
        	}
        
        	private CacheManager() {
        	}
        
        	public static CacheManager getInstance() {
        		return instance;
        	}
        }
        

        However, the drawback of the above code is that it instantiates the object even when it’s not needed, during class loading.

        1.3 Lazy Initialization

        In many cases, it’s advisable to postpone the creation of an object until it’s actually needed. To achieve this, we can delay the instantiation process until the first call to the getInstance() method. However, a challenge arises in a multithreaded environment when multiple threads are executing simultaneously; it might lead to the creation of more than one instance of the class. To prevent this, we can declare the getInstance() method as synchronized.
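
        A minimal sketch of this lazily initialized, synchronized variant looks like the following:

        Java
        public class CacheManager {

        	private static CacheManager instance;

        	private CacheManager() {
        	}

        	// Synchronizing the factory method keeps lazy initialization thread-safe,
        	// at the cost of acquiring the lock on every call.
        	public static synchronized CacheManager getInstance() {
        		if (instance == null) {
        			instance = new CacheManager();
        		}
        		return instance;
        	}
        }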

        1.4 Override clone() Method and Throw CloneNotSupportedException

        To prevent a singleton class from being cloneable, it is recommended to have the class implement the Cloneable interface and override the clone() method, throwing CloneNotSupportedException to prevent cloning of the object. The clone() method in the Object class is protected and not visible outside the class unless it is overridden, which is why we override clone() and throw the exception there.

        However, there’s a problem with the synchronized lazy-initialization approach described in section 1.3. After the first call to getInstance(), subsequent calls still acquire the lock and check the instance == null condition, even though this is no longer necessary. Acquiring and releasing locks are costly operations, and we should minimize them. To address this issue, we can implement a double-check of the condition.

        Additionally, it’s recommended to declare the static member instance as volatile to ensure thread-safety in a multi-threaded environment.

        1.5 Serialization and Deserialization Issue

        Serialization and deserialization of a singleton class can create multiple instances, violating the singleton rule. To address this, we need to implement the readResolve() method within the singleton class. During the deserialization process, the readResolve() method is called to reconstruct the object from the byte stream. By implementing this method and returning the same instance, we can avoid the creation of multiple objects even during serialization and deserialization.

        Now, let’s revisit the provided code to address the issue:

        Java
        public class CacheSerialization {
        
        	public static void main(String[] args) throws FileNotFoundException, IOException, ClassNotFoundException {
        	
        		CacheManager cacheManager1 = CacheManager.getInstance();
        		ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(
        				new File("D:\\cacheManager.ser")));
        		oos.writeObject(cacheManager1);
        
        		CacheManager cacheManager2 = null;
        		ObjectInputStream ois = new ObjectInputStream(new FileInputStream(
        				new File("D:\\cacheManager.ser")));
        		cacheManager2 = (CacheManager) ois.readObject();
        
        		System.out.println("cacheManager1 == cacheManager2 :  " + (cacheManager1 == cacheManager2)); // false
        	}
        }

        In this code, cacheManager1 and cacheManager2 do not behave as expected: the comparison after deserialization returns false. This discrepancy indicates the creation of a duplicate object, which contradicts the desired behavior of the singleton pattern.

        To resolve this issue, you can rectify your CacheManager class by adding a readResolve() method. This method ensures that only one instance is maintained throughout the deserialization process, thereby preserving the correct behavior of the singleton pattern.

        Here is the final version of the singleton class, which addresses all the relevant cases:

        Java
        import java.io.Serializable;
        
        public class CacheManager implements Serializable, Cloneable {
            private static volatile CacheManager instance;
        
            // Private constructor to prevent external instantiation.
            private CacheManager() {
            }
        
            // Method to retrieve the singleton instance.
            public static CacheManager getInstance() {
                if (instance == null) {
                    synchronized (CacheManager.class) {
                        // Double-check to ensure a single instance is created.
                        if (instance == null) {
                            instance = new CacheManager();
                        }
                    }
                }
                return instance;
            }
        
            // This method is called during deserialization to return the existing instance.
            public Object readResolve() {
                return instance;
            }
        
            // Prevent cloning by throwing CloneNotSupportedException.
            @Override
            public Object clone() throws CloneNotSupportedException {
                throw new CloneNotSupportedException();
            }
        }
        

        In conclusion, the provided code defines a robust implementation of the Singleton Design Pattern in Java. It guarantees that only one instance of the CacheManager class is created, even in multithreaded environments, thanks to double-checked locking and the use of the volatile keyword.

        Moreover, it addresses potential issues with serialization and deserialization by implementing the readResolve() method, ensuring that only a single instance is maintained throughout the object’s lifecycle. Additionally, it prevents cloning of the singleton object by throwing CloneNotSupportedException in the clone() method.

        Conclusion: Ensuring Singleton Design Pattern Best Practices

        In summary, this code exemplifies a well-rounded approach to creating and safeguarding a singleton class while adhering to best practices and design principles.

        Java 21 Features With Examples


        Java 21 brings some exciting new features to the world of programming. In this article, we’ll explore these Java 21 features with practical examples to make your Java coding experience even better.

        Please download OpenJDK 21 and add it to the PATH environment variable before switching to Java 21.

        Java 21 Features:

        1. Pattern Matching for Switch

        Java 21 brings a powerful feature called Pattern Matching for Switch. It simplifies switch statements, making them more concise and readable. Check out an example:

        Before Java 21

        // Before Java 21
        String response = "yes";
        
        switch (response) {
            case "yes":
            case "yeah":
                System.out.println("You said yes!");
                break;
            case "no":
            case "nope":
                System.out.println("You said no!");
                break;
            default:
                System.out.println("Please choose.");
        }
        

        In Java 21, you can rewrite the code provided above as follows:

        // Java 21 Pattern Matching for Switch
        String response = "yes";
        switch (response) {
            case "yes", "yeah" -> System.out.println("You said yes!");
            case "no", "nope" -> System.out.println("You said no!");
            default -> System.out.println("Please choose.");
        }
        

        Explore more about Pattern Matching for Switch in the full article.

        2. Unnamed Patterns and Variables

        Java 21 introduces Unnamed Patterns and Variables, making your code more concise and expressive. Here is a short example:

        String userInput = "User Input"; 
        
        try { 
            int number = Integer.parseInt(userInput);
            // Use 'number'
        } catch (NumberFormatException ex) { 
            System.out.println("Invalid input: " + userInput);
        }
        

        Now, with Java 21, the above code can be rewritten as follows

        String userInput = "User Input"; 
        
        try { 
            int number = Integer.parseInt(userInput);
            // Use 'number'
        } catch (NumberFormatException _) { 
            System.out.println("Invalid input: " + userInput);
        }
        

        In this updated version, we no longer use the ‘ex’ variable; instead, we’ve replaced it with an underscore (_). This simple change helps streamline the code and makes it more concise.

        For a deep dive into this feature and more practical examples, visit the full article.

        3. Unnamed Classes and Instance Main Methods

        Java 21 introduces a fresh approach to defining classes and instance main methods right in your code. Let’s take a quick look at how this feature operates:

        // Java 21 (preview): an unnamed class. The whole source file contains
        // only fields and methods, with no enclosing class declaration, and the
        // main method can be an instance method without parameters.
        void main() {
            System.out.println("Hello from an unnamed class!");
        }
        

        Explore more about unnamed classes and instance main methods in the full article.

        4. String Templates in Java

        Java 21 introduces String Templates, simplifying string concatenation. Take a look:

        // Java (using String.format)
        String name = "Sachin P";
        String message = String.format("Welcome %s!", name);
        

        In Java 21, you can create a message using this syntax:

        String name = "Sachin P";
        String message = STR."Welcome \{name}!";
        

        Discover the power of string templates and practical examples in the full article.

        5. Sequenced Collections in Java 21

        Java 21 introduces Sequenced Collections, making it easier to work with ordered data. Here’s a glimpse:

        List<Integer> list = new ArrayList<Integer>();
        list.add(0);
        list.add(1);
        list.add(2);
        
        // Fetch the first element (element at index 0)
        int firstElement = list.get(0);
        
        // Fetch the last element
        int lastElement = list.get(list.size() - 1);
        
        

        In Java 21, you can retrieve elements using the following code.

        List<Integer> list = new ArrayList<Integer>();
        list.add(0);
        list.add(1);
        list.add(2);
        
        // Fetch the first element
        int firstElement = list.getFirst();
        
        // Fetch the last element
        int lastElement = list.getLast();
        

        Learn more about SequencedCollection, SequencedSet and SequencedMap and explore practical examples in the full article.

        To put it simply, Java 21 brings some exciting improvements to the language. Features like Unnamed Patterns and Variables, along with Pattern Matching for Switch, make coding easier and improve code readability. These enhancements make Java development more efficient and enjoyable. Java developers now have the tools to write cleaner and more expressive code, marking a significant step forward in the world of Java programming.

        If you’re curious to explore more features and details about Java 21, I recommend checking out the official Java release notes available at this link: Java 21 Release Notes. These release notes provide comprehensive information about all the changes and enhancements introduced in Java 21.

        Java 21 Pattern Matching for Switch Example

        Java has been constantly evolving to meet the demands of modern programming. With the release of Java 21, a notable feature called Java 21 Pattern Matching for Switch has been introduced. In this article, we’ll explore what this feature is all about, how it works, and see some real-world examples to understand its practical applications.

        Introducing Java 21’s Pattern Matching for Switch

        Java 21 brings a significant improvement known as Pattern Matching for Switch, which revolutionizes the way we handle switch statements. This feature makes code selection more straightforward by letting us use patterns in case labels. It not only improves code readability but also reduces redundancy and simplifies complex switch statements.

        How Pattern Matching Improves Switch Statements

        Pattern Matching allows developers to utilize patterns in case labels, making it easier to match and extract components from objects. This eliminates the need for casting and repetitive instanceof checks, resulting in cleaner and more efficient code. Let’s dive into some practical examples to understand how Pattern Matching functions in real-world scenarios.

        Practical Examples

        Example 1: Matching a Specific Value

        Consider a scenario where you need to categorize shapes based on their names. In traditional switch statements, you might do the following:

        switch (shape) {
            case "circle":
                System.out.println("Handling circle logic");
                break;
            case "rectangle":
                System.out.println("Handling rectangle logic");
                break;
            case "triangle":
                System.out.println("Handling triangle logic");
                break;
            default:
                // Handle other cases
        }
        

        With Pattern Matching, you can simplify the above code:

        switch (shape) {
            case "circle" -> {
            	 System.out.println("Handling circle logic");
            }
            case "rectangle" -> {
                System.out.println("Handling rectangle logic");
            }
            case "triangle" -> {
               System.out.println("Handling triangle logic");
            }
            default -> {
                // Handle other cases
            }
        }
        

        Pattern Matching allows for a more concise and readable switch statement.
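
        The arrow form also works as a switch expression that yields a value. Here is a small hedged variant of the same example (the result strings are illustrative, not from the original):

        String handling = switch (shape) {
            case "circle"    -> "Handling circle logic";
            case "rectangle" -> "Handling rectangle logic";
            case "triangle"  -> "Handling triangle logic";
            default          -> "Handling other shapes";
        };
        System.out.println(handling);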

        Example 2: Matching Complex Objects

        Pattern Matching can also simplify code when dealing with complex objects. Suppose you have a list of vehicles, and you want to perform specific actions based on the vehicle type:

        for (Object v : vehicles) {
            if (v instanceof Car) {
                Car car = (Car) v;
                //car logic
            } else if (v instanceof Scooter) {
                Scooter scooter = (Scooter) v;
                //scooter logic
            } else if (v instanceof Jeep) {
                Jeep jeep = (Jeep) v;
                //jeep logic
            }
        }
        

        The code above can be rewritten using Pattern Matching.

        for (Object v : vehicles) {

            switch (v) {
                case Car car -> {
                    // car logic ('car' is already typed; no explicit cast needed)
                }
                case Scooter scooter -> {
                    // scooter logic
                }
                case Jeep jeep -> {
                    // jeep logic
                }
                default -> {
                    // other vehicle types
                }
            }
        }
        

        Pattern Matching simplifies the code and eliminates the need for explicit casting.

        Example 3: Java 21 – Handling Null Cases in Switch Statements

        Before Java 21, switch statements and expressions posed a risk of throwing NullPointerExceptions when the selector expression was evaluated as null.

        public void handleGreetings(String s) {
         // If 's' is null and we don't handle it, it will result in a NullPointerException.
            if (s == null) {
                System.out.println("No message available.");
                return;
            }
            switch (s) {
                case "Hello", "Hi" -> System.out.println("Greetings!");
                case "Bye" -> System.out.println("Goodbye!");
                default -> System.out.println("Same to you!");
            }
        }
        

        Java 21 introduces a new null case label. The above code can be rewritten like this:

        public void handleGreetingsInJava21(String s) {
            switch (s) {
                case null           -> System.out.println("No message available.");
                case "Hello", "Hi" -> System.out.println("Hello there!");
                case "Bye"         -> System.out.println("Goodbye!");
                default            -> System.out.println("Same to you!");
            }
        }
        

        Example 4: Java 21 Pattern Matching with Guards

        A pattern case label can apply to many values, which often forces conditional code onto the right-hand side of a switch rule. Java 21 simplifies such code with guards: ‘when’ clauses in switch blocks that attach a boolean condition directly to a pattern case label.

        Without guards, the conditional logic has to live inside the case body:

        public void testInput(String response) {

            switch (response) {
                case null -> System.out.println("No message available.");
                case String s -> {
                    if (s.equalsIgnoreCase("MAYBE")) {
                        System.out.println("Not sure, please decide.");
                    } else if (s.equalsIgnoreCase("EXIT")) {
                        System.out.println("Exiting now.");
                    } else {
                        System.out.println("Please retry.");
                    }
                }
            }
        }
        

        Using guards in Java 21, the same logic becomes:

        public void test21Input(String response) {
        
            switch (response) {
                case null -> System.out.println("No message available.");
                case String s when s.equalsIgnoreCase("MAYBE") -> {
                    System.out.println("Not sure, please decide.");
                }
                case String s when s.equalsIgnoreCase("EXIT") -> {
                    System.out.println("Exiting now.");
                }
                default -> System.out.println("Please retry.");
            }
        }
        

        With Java 21’s new features, your code becomes more concise and easier to read, making pattern matching a powerful tool in the Java programming toolkit.

        Benefits of Java 21

        1. Improved code readability
        2. Reduced boilerplate code
        3. Simplified complex switch statements
        4. Enhanced developer productivity

        In conclusion, Java 21 Pattern Matching for Switch is a valuable addition to the Java language, making code selection more straightforward and efficient. By using patterns in switch statements, developers can write cleaner, more concise, and more readable code, ultimately improving software quality and maintainability.

        For additional information on pattern matching in Java, you can visit the following link: Pattern Matching (JEP 441) – OpenJDK. This link provides detailed information about the Java Enhancement Proposal (JEP) for pattern matching in Java.

        Java 21 Unnamed Patterns and Variables with Examples

        Java 21 Unnamed Patterns and Variables are introduced as a preview feature (JEP 443) that simplifies data processing. It enables the use of unnamed patterns and variables, denoted by an underscore character (_), to match components within data structures without specifying their names or types. Additionally, you can create variables that are initialized but remain unused in the code.

        Let’s break this down in simpler terms:

        Introduction:

        Before we dive into the world of Java records, let’s consider a situation where the conciseness of code
        presents a challenge. In this instance, we will work with two record types: ProjectInfo and TeamMemberInfo.

        record ProjectInfo(Long projectID, String projectName, Boolean isCompleted) {
          // Constructor and methods (if any)
        }
        
        record TeamMemberInfo(Long memberID, String memberName, LocalDate joinDate, Boolean isActive, ProjectInfo projectInfo) {
          // Constructor and methods (if any)
        }
        

        In Java, records provide a streamlined approach to create immutable data structures, particularly suitable for storing plain data. They eliminate the need for traditional getter and setter methods.

        Now, let’s delve into how record patterns can simplify code by deconstructing instances of these records into their constituent components.

        TeamMemberInfo teamMember = new TeamMemberInfo(101L, "Alice", LocalDate.of(1985, 8, 22), true, projectInfo);
        
        if (teamMember instanceof TeamMemberInfo(Long id, String name, LocalDate joinDate, Boolean isActive, ProjectInfo projInfo)) {
          System.out.printf("Team member %d joined on %s.", id, joinDate); 
          // Team member 101 joined on 1985-08-22
        }
        

        When working with record patterns, it’s often the case that we require only certain parts of the record and not all of them.
        In the above example, we exclusively used the “id” and “joinDate” components.
        Spelling out the other components such as “name,” “isActive,” and “projInfo” doesn’t enhance clarity; instead, it adds verbosity without improving readability.

        In Java 21, this new feature is designed to eliminate this verbosity.

        Exploring Unnamed Patterns and Variables

        In Java 21, a new feature introduces the use of underscores (_) to represent record components and local variables, indicating our lack of interest in them.

        With this new feature, we can revise the previous example more concisely as shown below.
        It’s important to observe that we’ve substituted the “name,” “isActive,” and “projInfo” components with underscores (_).

        if (teamMember instanceof TeamMemberInfo(Long id, _, LocalDate joinDate, _, _)) {
          System.out.printf("Team member %d joined on %s.", id, joinDate); // Team member 101 joined on 1985-08-22
        }
        

        In a similar manner, we can employ the underscore character with nested records when working with the TeamMemberInfo record,
        especially when we don’t need to use certain components. For example, consider the following scenario where we only
        require the team member’s ID for specific database operations, and the other components are unnecessary.

        if (teamMember instanceof TeamMemberInfo(Long id, _, _, _, _)) {
          // Utilize the team member's ID
          System.out.println("Team Member ID is: " + id);  //Team Member ID is: 101
        }
        

        In this code, the underscore (_) serves as a placeholder for the components we don’t need to access
        within the TeamMemberInfo record.

        Starting from Java 21, you can use unnamed variables in these situations:

        1. Within a local variable declaration statement in a code block.
        2. In the resource specification of a ‘try-with-resources’ statement.
        3. In the header of a basic ‘for’ statement.
        4. In the header of an ‘enhanced for loop.’
        5. As an exception parameter within a ‘catch’ block.
        6. As a formal parameter within a lambda expression.

        Java 21 Unnamed Patterns and Variables Practical Examples

        Let’s dive into a few practical examples to gain a deeper understanding.

        Example 1: Local Unnamed Variable

        Here, we create a local unnamed variable to handle a situation where we don’t require the result.

        int _ = someFunction(); // We don't need the result
        

        Example 2: Unnamed Variable in a ‘catch’ Block

        In this case, we use an unnamed variable within a ‘catch’ block to handle exceptions without utilizing the caught value.

        String userInput = "Your input goes here"; 
        
        try { 
            int number = Integer.parseInt(userInput);
            // Use 'number'
        } catch (NumberFormatException _) { 
            System.out.println("Invalid input: " + userInput);
        }
        

        Example 3: Unnamed Variable in a ‘for’ Loop

        In the following example, we employ an unnamed variable within a simple ‘for’ loop, where the result of the runOnce() function is unused.

        for (int i = 0, _ = runOnce(); i < array.length; i++) {
          // ... code that utilizes 'i' ...
        }
        
        

        Example 4: Unnamed Variable in Lambda Expression

        // Define a lambda expression with an unnamed parameter.
        // Note: an unnamed parameter cannot be referenced inside the body.
        Consumer<String> printMessage = _ -> System.out.println("Hello there!");

        // Use the lambda expression; the argument is accepted but intentionally ignored.
        printMessage.accept("John"); // Hello there!
        

        Example 5: Unnamed Variable in try-with-resources

        try (var _ = DatabaseConnection.openConnection()) {
            // The connection is opened and auto-closed, but never used directly.
        }
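
        Example 6: Unnamed Variable in an Enhanced ‘for’ Loop

        The list of supported locations above also mentions the enhanced ‘for’ loop. Here is a minimal sketch that counts elements without ever using them (the orderIds list is illustrative):

        var orderIds = java.util.List.of("A-1", "A-2", "A-3");
        int total = 0;
        for (String _ : orderIds) { // each element is intentionally ignored; we only count
            total++;
        }
        System.out.println(total); // 3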
        

        In all the above examples, where the variables remain unused and their names are irrelevant, we simply declare them without providing a name, using the underscore (_) as a placeholder. This practice enhances code clarity and reduces unnecessary distractions.

        Conclusion
        Java 21 introduces a convenient feature where you can use underscores (_) as placeholders for unnamed variables. This simplifies your code by clearly indicating that certain variables are intentionally unused within their specific contexts. You can apply unnamed variables in multiple situations, such as local variable declarations, ‘try-with-resources’ statements, ‘for’ loop headers, ‘enhanced for’ loops, ‘catch’ block parameters, and lambda expressions. This addition to Java 21 improves code readability and helps reduce unnecessary clutter when you need to declare variables but don’t actually use them in your code.

        Java 21 Unnamed Classes and Instance Main Methods

        Java is evolving to make it easier for beginners to start coding without the complexity of large-scale programming. With the introduction of Unnamed Classes in Java 21, this enhancement allows students to write simple programs initially and gradually incorporate more advanced features as they gain experience. This feature aims to simplify the learning curve for newcomers.

        Simplifying a Basic Java Program

        Think about a straightforward Java program, like one that calculates the sum of two numbers:

        public class AddNumbers { 
        
            public static void main(String[] args) { 
                int num1 = 5;
                int num2 = 7;
                int sum = num1 + num2;
                System.out.println("The sum is: " + sum); //12
            }
        }
        

        This program may appear more complicated than it needs to be for such a simple task. Here’s why:

        • The class declaration and the mandatory public access modifier are typically used for larger programs but are unnecessary here.
        • The String[] args parameter is designed for interacting with external components, like the operating system’s shell. However, in this basic program, it serves no purpose and can confuse beginners.
        • The use of the static modifier is part of Java’s advanced class-and-object model. For beginners, it can be perplexing. To add more functionality to this program, students must either declare everything as static (which is unconventional) or Learn about static and instance members and how objects are created.

        Java 21 makes it easier for beginners to write their very first programs without the need to understand complex features designed for larger applications. This enhancement involves two key changes:

        1. Instance Main Methods:

        The way Java programs are launched is changing, allowing the use of instance main methods. These methods don’t need to be static, public, or have a String[] parameter. This modification enables simplifying the traditional “Hello, World!” program to something like this:

        class GreetingProgram {
            void main() {
                System.out.println("Hello, World!");
            }
        }
        

        Execute the program using the following command: java --source 21 --enable-preview GreetingProgram.java

        Output

        Hello, World!

        2. Unnamed Classes:

        Java 21 also introduces unnamed classes, eliminating the need for an explicit class declaration and making the code cleaner and more straightforward:

        void main() {
            System.out.println("Welcome to Java 21 Features");
        }
        

        Save this file with a name of your choice, then run the program using the following command: java --source 21 --enable-preview YourFileName.java

        Ensure that you replace “YourFileName” with the actual name of your file.

        Output:

        Welcome to Java 21 Features
        


        Please note that these changes are part of a preview language feature, and they are disabled by default. To try them out in JDK 21, you can enable preview features using the following commands:

        Bash
        # Compile the program
        javac --release 21 --enable-preview FileName.java

        # Or run it directly from source
        java --source 21 --enable-preview FileName.java

        You can find more information about these features on the official OpenJDK website by visiting the following link: Java Enhancement Proposal 443 (JEP 443).

        To sum it up, Java 21 is bringing some fantastic improvements to make programming more beginner-friendly and code cleaner. With the introduction of instance main methods and unnamed classes, Java becomes more accessible while maintaining its strength. These changes mark an exciting milestone in Java’s evolution, making it easier for newcomers to dive into coding. Since these features are in preview, developers have the opportunity to explore and influence the future of Java programming.

        Java String Templates in Java 21: Practical Examples

        What exactly is a String Template?

        Template strings, often known as Java string templates, are a common feature found in many programming languages, including TypeScript template strings and Angular interpolation. Essentially, they allow us to insert variables into a string, and the variable values are determined at runtime. As a result, Java string templates generate varying output based on the specific values of the variables.

        Here are examples of Java string templates with different greetings and names:

        // TypeScript (template literal)
        const nameTS = "John";
        const messageTS = `Welcome ${nameTS}!`;

        // Angular (template interpolation in an HTML template)
        // <p>Welcome {{ name }}!</p>

        # Python (f-string)
        namePython = "Alice"
        messagePython = f"Welcome {namePython}!"

        // Java (using String.format)
        String nameJava = "Bob";
        String messageJava = String.format("Welcome %s", nameJava);
        

        Each of the examples provided above will give you the same result when used with the same ‘name’ variable. JEP-430 is an effort to introduce template string support in the Java programming language, much like what you see here:

        In Java 21, you can create a message using this syntax:

        String name = "Bob";
        String message = STR."Welcome \{name}!";
        

        String Templates in the Java Language

        The Old-Fashioned Way

        String formatting in Java is not a new concept. Historically, programmers have employed various methods to create formatted strings, including string concatenation, StringBuilder, String.format(), and the MessageFormat class.

        public class WelcomeMessage {
        
            public static void main(String[] args) {
                String message;
                String name = "John";
        
                // Concatenate a welcome message
                message = "Welcome " + name + "!";
        
                // Use String.format for formatting
                message = String.format("Welcome %s!", name);
        
                // Format using MessageFormat
                message = new MessageFormat("Welcome {0}!").format(new Object[] { name });
        
                // Construct efficiently with StringBuilder
                message = new StringBuilder().append("Welcome ").append(name).append("!").toString();
        
                // Display the final welcome message
                System.out.println(message);
            }
        }
        

        The output for each method is the same; it displays:

        Welcome John!

        Java String Templates in Java 21: Secure Interpolation

        Java 21 has introduced template expressions, drawing inspiration from other programming languages. These expressions enable dynamic string interpolation during runtime. What sets Java’s approach apart is its focus on minimizing security risks, particularly when handling string values within SQL statements, XML nodes, and similar scenarios.

        In terms of syntax, a template expression resembles a regular string literal with a specific prefix:

        // Code Example
        String message = STR."Greetings \{name}!";
        

        In this context:

        • STR represents the template processor.
        • There is a dot operator (.) connecting the processor and the expression.
        • The template string contains an embedded expression in the form of \{name}.
        • The outcome of the template processor, and consequently the result of evaluating the template expression, is often a String—although this isn’t always the case.

        Template Processors in Java 21

        In the world of Java, you’ll encounter three distinct template processors:

        STR: This processor takes care of standard interpolation, making it a versatile choice for string manipulation.

        FMT: Unlike its counterparts, FMT not only performs interpolation but also excels at interpreting format specifiers located to the left of embedded expressions. These format specifiers are well-documented within Java’s Formatter class.

        RAW: RAW stands as a steadfast template processor, primarily generating unprocessed StringTemplate objects.

        Here’s an example demonstrating how each of these processors can be utilized:


        import static java.lang.StringTemplate.STR;
        import static java.lang.StringTemplate.RAW;
        
        public class TemplateProcessorTest {
            public static void main(String[] args) {
                String name = "JavaDZone";
        
                System.out.println(STR."Welcome to \{name}");
                System.out.println(RAW."Welcome to \{name}.");
            }
        }
        

        To put it into action, execute the following command within your terminal or command prompt:

        Bash
        java --enable-preview --source 21 TemplateProcessorTest.java

        Be sure to substitute “TemplateProcessorTest.java” with the actual name of your Java class.
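
        The FMT processor is not shown above. As a minimal hedged sketch, assuming the static import import static java.util.FormatProcessor.FMT; (also a preview API), it could be used like this; the format specifier sits immediately to the left of the embedded expression:

        double price = 49.957;
        // The %.2f specifier formats the value of the expression that follows it.
        String line = FMT."Total: %.2f\{price} EUR"; // "Total: 49.96 EUR"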

        Performing Arithmetic Operations within Expressions

        In Java 21, you have the capability to carry out arithmetic operations within expressions, providing you with the means to compute values and showcase the results directly within the expression itself.

        For instance, consider the following code snippet:

        int operand1 = 10, operand2 = 20;
        
        String resultMessage = STR."\{operand1} + \{operand2} = \{operand1 + operand2}";  // This will result in "10 + 20 = 30"
        

        You can use multi-line expressions:

        For the sake of improving code readability, you can split an embedded expression into multiple lines, emulating the style often seen in nested method calls resembling a builder pattern.

        Here’s an example to illustrate this:

        System.out.println(STR."The current date is: \{
            DateTimeFormatter.ofPattern("yyyy-MM-dd")
                .format(LocalDateTime.now())
        }");
        
        

        Exploring String Templates in Java 21

        The following Java class, StringTemplateTest, serves as an illustrative example of utilizing string templates with the STR template processor. It demonstrates how to integrate string interpolation and various expressions within template strings. Each section is accompanied by a description to provide a clear understanding of the usage.

        import static java.lang.StringTemplate.STR;
        import java.time.LocalDateTime;
        import java.time.format.DateTimeFormatter;
        import java.time.LocalTime;
        
        public class StringTemplateTest {
        
          private static String name = "JavaDZone";
          private String course = "Java21 Features";
          private static int a = 100;
          private static int b = 200;
        
          public static void main(String[] args) {
          
              // Using variable in template expression.
              System.out.println(STR."Welcome to \{name}");
        
               // Utilizing a method in the template expression.
              System.out.println(STR."Welcome to \{getName()}");
        
              
              StringTemplateTest st = new StringTemplateTest();
        
             // Using non-static variable in the template expression.
              System.out.println(STR."Welcome to \{st.course}");
        
               // Performing arithmetic operations within the expression.
              System.out.println(STR."\{a} + \{b} = \{a+b}");
        
                // Displaying the current date using expression
              System.out.println(STR."The current date is: \{DateTimeFormatter.ofPattern("yyyy-MM-dd").format(LocalDateTime.now())}");
              
              }
        
        
          public static String getName() {
            return name;
          }
          
        }
        

        To put it into action, execute the following command within your terminal or command prompt:

        Bash
        java --enable-preview --source 21 StringTemplateTest.java

        Make sure to replace “StringTemplateTest.java” with the actual name of your Java class.


        If you attempt to run or compile the StringTemplateTest class using the traditional java or javac commands, without the preview flags, you will encounter a compilation error.

        This error message indicates that string templates are considered a preview feature in Java, and they are not enabled by default. To enable and utilize string templates in your code, you should use the --enable-preview --source 21 flags when running or compiling your Java program. These flags allow you to take advantage of string templates’ functionality.

        In summary, this Java tutorial has explored the concept of string templates in Java. This feature was introduced in Java 21 as a preview, offering a fresh addition to the language’s capabilities. To stay updated on potential improvements and enhancements to this feature, be sure to keep an eye on the Java release notes. Enjoy your learning journey!

        Sequenced Collections in Java 21: Practical Examples

        In the world of Java programming, the introduction of Sequenced Collections in Java 21 has brought significant improvements to existing Collection classes and interfaces. This new feature allows easy access to both the first and last elements of a collection, thanks to the inclusion of default library methods. It also enables developers to obtain a reversed view of the collection with a simple method call.

        It’s important to clarify that in this context, “encounter order” does not refer to the physical arrangement of elements within the collection. Instead, it means that one element can be positioned either before (closer to the first element) or after (closer to the last element) another element in the ordered sequence.

        Let’s dive deeper into this exciting addition, which has been part of Java since the release of Java 21 (JEP 431).

        These newly introduced interfaces are:

        1. SequencedCollection
        2. SequencedSet
        3. SequencedMap

        Now, let’s illustrate the power of Sequenced Collections in Java 21 with a practical example:

        Example: Managing a Playlist

        Imagine you’re developing a music streaming application in Java. In this application, you need to maintain a playlist of songs, allowing users to navigate easily between tracks. The introduction of Sequenced Collections becomes incredibly valuable in this scenario.

        By utilizing SequencedSet, you can ensure that songs in the playlist maintain a specific order, enabling users to move seamlessly from one song to the next or return to the previous one. Additionally, you can use SequencedCollection to manage song history, making it effortless for users to retrace their listening journey, starting from the first song they played to the most recent one.

        This real-life example illustrates how Sequenced Collections in Java 21 can enhance the user experience and streamline the management of ordered data in your applications.
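
        As a small hedged sketch of the playlist idea (the class and track names are purely illustrative), the new methods could be used like this:

        import java.util.ArrayList;
        import java.util.List;

        public class PlaylistSketch {
            public static void main(String[] args) {
                // In Java 21, List is a SequencedCollection, so first/last access is built in.
                List<String> playlist = new ArrayList<>(List.of("Track A", "Track B", "Track C"));

                playlist.addFirst("Intro");              // queue a track at the start
                playlist.addLast("Outro");               // append a track at the end

                System.out.println(playlist.getFirst()); // Intro
                System.out.println(playlist.getLast());  // Outro
                System.out.println(playlist.reversed()); // [Outro, Track C, Track B, Track A, Intro]
            }
        }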


        Sequenced Collections in Java 21 Made Simple

        The SequencedCollection interface introduces a set of methods to streamline the addition, retrieval, and removal of elements at both ends of a collection. It also offers the ‘reversed()’ method, which presents a reversed view of the collection. Worth noting is that, apart from ‘reversed()’, all these methods are default methods, accompanied by default implementations (bodies elided below).

        public interface SequencedCollection<E> extends Collection<E> {

            SequencedCollection<E> reversed();

            default void addFirst(E e) { /* ... */ }
            default void addLast(E e) { /* ... */ }
            default E getFirst() { /* ... */ }
            default E getLast() { /* ... */ }
            default E removeFirst() { /* ... */ }
            default E removeLast() { /* ... */ }
        }
        

        For instance, consider the following code snippet where we create an ArrayList and perform new sequenced operations on it:

        ArrayList<Integer> list = new ArrayList<>();
        
        list.add(10);          // Adds 10 to the list.
        list.addFirst(0);      // Adds 0 to the beginning of the list.
        list.addLast(20);      // Adds 20 to the end of the list.
        System.out.println("list: " + list);        // Output: list: [0, 10, 20]
        System.out.println(list.getFirst());         // Output: 0
        System.out.println(list.getLast());          // Output: 20
        System.out.println(list.reversed());        // Output: [20, 10, 0]
        
        

        This code demonstrates how Sequenced Collections simplify the management of ordered data within a collection, offering easy access to elements at both ends and providing a convenient method to view the collection in reverse order.

        SequencedSet: Streamlined Ordered Sets

        The SequencedSet interface is designed specifically for Set implementations, such as LinkedHashSet. It builds upon the SequencedCollection interface while customizing the ‘reversed()’ method. The key distinction lies in the return type of ‘SequencedSet.reversed()’, which is now ‘SequencedSet’.

        SequencedSet interface:

        public interface SequencedSet<E> extends SequencedCollection<E>, Set<E> {
            SequencedSet<E> reversed();  // Overrides and specifies the return type for reversed() method.
        }
        

        Example: Using SequencedSet

        Let’s explore an example of how to utilize SequencedSet with LinkedHashSet:

        import java.util.*;
        
        
        public class SequencedSetExample {
        
            public static void main(String[] args) {
              LinkedHashSet<Integer> hashSet = new LinkedHashSet<>(List.of(5, 8, 12, 9, 10));
        
              System.out.println("LinkedHashSet contents: " + hashSet); // Output: [5, 8, 12, 9, 10]
              // First element in the LinkedHashSet.
              System.out.println("First element: " + hashSet.getFirst()); // Output: 5
              
              // Last element in the LinkedHashSet.
              System.out.println("Last element: " + hashSet.getLast()); // Output: 10
              
              // reversed view of the LinkedHashSet.
              System.out.println("Reversed view: " + hashSet.reversed()); // Output: [10, 9, 12, 8, 5]
            }
        
        }
        

        When you run this class, you’ll see the following output:

        LinkedHashSet contents: [5, 8, 12, 9, 10]
        First element: 5
        Last element: 10
        Reversed view: [10, 9, 12, 8, 5]

        SequencedMap: Changing How Maps Are Ordered

        Understanding SequencedMap

        SequencedMap is a specialized interface designed for Map classes like LinkedHashMap, introducing a novel approach to managing ordered data within maps. Unlike SequencedCollection, which handles individual elements, SequencedMap offers its unique methods that manipulate map entries while considering their access order.

        Exploring SequencedMap Features

        SequencedMap introduces a set of default methods to enhance map entry management:

        • firstEntry(): Retrieves the first entry in the map.
        • lastEntry(): Retrieves the last entry in the map.
        • pollFirstEntry(): Removes and returns the first entry in the map.
        • pollLastEntry(): Removes and returns the last entry in the map.
        • putFirst(K k, V v): Inserts an entry at the beginning of the map.
        • putLast(K k, V v): Inserts an entry at the end of the map.
        • reversed(): Provides a reversed view of the map.
        • sequencedEntrySet(): Returns a SequencedSet of map entries, maintaining the encounter order.
        • sequencedKeySet(): Returns a SequencedSet of map keys, reflecting the encounter order.
        • sequencedValues(): Returns a SequencedCollection of map values, preserving the encounter order.

        Example: Utilizing SequencedMap

        In the traditional approach, prior to Java 21, working with a LinkedHashMap to manage key-value pairs involved manual iteration and manipulation of the map. Here’s how it was done:

        LinkedHashMap<Integer, String> hashMap = new LinkedHashMap<>();
                hashMap.put(10, "Ten");
                hashMap.put(20, "Twenty");
                hashMap.put(30, "Thirty");
                hashMap.put(40, "Fourty");
                hashMap.put(50, "Fifty");
        
                System.out.println("hashmap: " + hashMap);
                // Output: {10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty}
        
                hashMap.put(0, "Zero");
                hashMap.put(100, "Hundred");
        
                System.out.println(hashMap); // {10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty, 0=Zero, 100=Hundred}
        
        
                // Fetching the first entry
                System.out.println("Fetching first entry: " + hashMap.entrySet().iterator().next());
                // Output: Fetching the first entry: 10=Ten
        
        
                // Fetching the last entry
                Entry<Integer, String> lastEntry = null;
                for (java.util.Map.Entry<Integer, String> entry : hashMap.entrySet()) {
                    lastEntry = entry;
                }
                System.out.println("Fetching last entry: " + lastEntry); // Output: Fetching the last entry: 100=Hundred
        
        
                // Removing the first entry
                Entry<Integer, String> removedFirstEntry = hashMap.entrySet().iterator().next();
                hashMap.remove(removedFirstEntry.getKey());
                System.out.println("Removing first entry: " + removedFirstEntry);
                // Output: Removing the first entry: 10=Ten
        
        
                hashMap.remove(lastEntry.getKey());
                System.out.println("Removing last entry: " + lastEntry);
                // Output: Removing the last entry: 100=Hundred
        
        
                System.out.println("hashMap: " + hashMap);
                // Output after removals: {20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty, 0=Zero}
        
        
                LinkedHashMap<Integer, String> reversedMap = new LinkedHashMap<>();
                List<Entry<Integer, String>> entries = new ArrayList<>(hashMap.entrySet());
        
                Collections.reverse(entries);
        
                for (Entry<Integer, String> entry : entries) {
                    reversedMap.put(entry.getKey(), entry.getValue());
                }
        
        
                System.out.println("Reversed view of the map: " + reversedMap);
                // Output: Reversed view of the map: {50=Fifty, 40=Fourty, 30=Thirty, 20=Twenty,
                // 10=Ten}
        

        However, in Java 21, with the introduction of sequenced collections, managing a LinkedHashMap has become more convenient. Here’s the updated code that demonstrates this.

        import java.util.LinkedHashMap;
        
        
        public class SequencedMapExample {
         
            public static void main(String[] args) {
                
               LinkedHashMap<Integer, String> hashMap = new LinkedHashMap<>();
               hashMap.put(10, "Ten");
               hashMap.put(20, "Twenty");
               hashMap.put(30, "Thirty");
               hashMap.put(40, "Fourty");
               hashMap.put(50, "Fifty");
        
               System.out.println(hashMap); 
               // Output: {10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty}
        
               hashMap.putFirst(0, "Zero");
               hashMap.putLast(100, "Hundred");
               
               System.out.println(hashMap); 
               // Output after adding elements at the beginning and end:
               // {0=Zero, 10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty, 100=Hundred}
        
               System.out.println("Fetching first entry: " + hashMap.firstEntry());
               // Fetching the first entry: 0=Zero
        
               System.out.println("Fetching last entry: " + hashMap.lastEntry());
               // Fetching the last entry: 100=Hundred
                
               System.out.println("Removing first entry: " + hashMap.pollFirstEntry());
               // Removing the first entry: 0=Zero
        
               System.out.println("Removing last entry: " + hashMap.pollLastEntry());
               // Removing the last entry: 100=Hundred
        
               System.out.println("hashMap: " + hashMap);
               // Output after removals: {10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty}
        
               System.out.println("Reversed: " + hashMap.reversed());
               // Reversed view of the map: {50=Fifty, 40=Fourty, 30=Thirty, 20=Twenty, 10=Ten}
            }
        }
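
        The view methods from the feature list above (sequencedKeySet(), sequencedValues(), sequencedEntrySet()) are not exercised in this example. A brief hedged sketch, continuing with the same hashMap after the removals:

        System.out.println(hashMap.sequencedKeySet());   // [10, 20, 30, 40, 50]
        System.out.println(hashMap.sequencedValues());   // [Ten, Twenty, Thirty, Fourty, Fifty]
        System.out.println(hashMap.sequencedEntrySet()); // [10=Ten, 20=Twenty, 30=Thirty, 40=Fourty, 50=Fifty]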
        

        Conclusion: Simplifying Java 21

        Sequenced collections are a valuable addition to Java 21, enhancing the language’s ease of use. These features simplify collection management, making coding in Java 21 even more accessible and efficient. Enjoy the benefits of these enhancements in your Java 21 development journey!

        Spring Boot Actuator: 5 Performance Boost Tips


        Are you ready to take your application to the next level? In the world of software development, it’s not enough to create an application; you also need to ensure it runs smoothly in a production environment. This is where “Spring Boot Actuator” comes into play. In this comprehensive guide, we’ll walk you through the process of enhancing your application’s monitoring and management capabilities using Spring Boot Actuator.

        Step 1: Understanding the Need

        Why Additional Features Are Essential

        After thoroughly testing your application, you’ll likely find that deploying it in a production environment requires more than just functional code. You need features that enable monitoring and management. Traditionally, this might involve maintaining a dedicated support team to ensure your application is always up and running.

        Step 2: What is Spring Boot Actuator?

        Spring Boot Actuator is a powerful feature bundled with Spring Boot. It provides a set of predefined endpoints that simplify the process of preparing your application for production deployment. These endpoints allow you to monitor and manage various aspects of your application seamlessly.

        Step 3: Spring Boot Actuator Endpoints

        Spring Boot Actuator offers a variety of endpoints to cater to different monitoring and management needs:

        1. info: Provides arbitrary information about your application, such as author, version, and licensing.
        2. health: Checks the liveness probe of your application to ensure it’s running and accessible.
        3. env: Displays all environment variables used by your application.
        4. configprops: Lists all configuration properties utilized by your application.
        5. beans: Shows all the bean definitions within the IoC container.
        6. threaddump: Provides access to the current JVM thread dump.
        7. metrics: Offers runtime information about your application, including memory usage, CPU utilization, and heap status.
        8. loggers: Displays loggers and their logging levels.
        9. logfile: Shows the application’s log file.
        10. shutdown: Allows for remote application shutdown.
        11. sessions: Presents active HTTP sessions of the web application.
        12. conditions: Shows the conditions that influence auto-configurations.

        Step 4: Enabling Actuator Endpoints

        Before we can start configuring and using Actuators, we need to add the Actuator dependency to our project pom.xml.

        Spring Boot Starter Actuator Dependency

        Maven:

        XML
        <dependency>
        		<groupId>org.springframework.boot</groupId>
        		<artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>

        Gradle:

        Groovy
        implementation 'org.springframework.boot:spring-boot-starter-actuator'

        To make use of these valuable endpoints, you’ll need to enable them by adding the “spring-boot-starter-actuator” dependency to your Spring Boot project. This will grant you access to the endpoints via URLs like “http://localhost:8080/actuator/{endpointId}”.

        These endpoints are exposed by Spring Boot in two ways:

        1. JMX: JMX, or Java Management Extensions, is a specification provided by Java as part of J2SE5. It standardizes the API for managing devices, servers, and more. With JMX, you can programmatically manage devices or servers at runtime through JMX extensions. For example, instead of manually configuring a datasource in a WebLogic server through its console, you can automate the datasource configuration on an application server using JMX endpoints exposed by the application server.
        2. HTTP/Web: These are REST endpoints exposed over the HTTP protocol, making them accessible from a web browser or any HTTP client.

        However, it’s worth noting that it’s recommended to expose Actuator endpoints through JMX rather than HTTP/Web endpoints due to security reasons. All Actuator endpoints are available for access via both JMX and HTTP/Web by default, and you don’t need to write any special code to enable or support them. You simply need to configure which endpoints you want to expose over which protocol, and Spring Boot Actuator will take care of exposing them accordingly.

        To make an endpoint accessible in Spring Boot Actuator, you need to do two things:

        1. Enable the Endpoint
        2. Expose the Endpoint through JMX, HTTP, or Both

        By default, all the endpoints of Spring Boot Actuator are enabled, except for the “shutdown” endpoint. If you want to disable these endpoints by default, you can add a property in your application.properties file:

        Properties
        management.endpoints.enabled-by-default=false

        Now, you can enable each individual endpoint in a controlled way using the endpoint’s ID, as shown below:

        Properties
        # Enable specific Actuator endpoints
        management.endpoint.info.enabled=true
        management.endpoint.shutdown.enabled=true
        management.endpoint.endpointId.enabled=true

        In your application.properties file, you can include the following configuration to expose all the endpoints:

        Properties
        # Expose all endpoints
        management.endpoints.web.exposure.include=*

        This configuration tells Spring Boot Actuator to include all endpoints for web/HTTP access.

        Excluding Specific Actuator Endpoints

        Properties
        # Exclude specific endpoints by their ID
        management.endpoints.web.exposure.exclude=shutdown, sessions, conditions

        Using application.yaml:

        In your application.yaml file, you can include the following configuration to expose all the endpoints:

        YAML
        management:
          endpoints:
            web:
              exposure:
                include: "*"

        Next, you’ll need to specify how you want to expose these endpoints, either through JMX or HTTP. By default, only two endpoints, “info” and “health,” are exposed for security reasons. If you want more Actuator endpoints to be accessible, you can configure this using the following properties in either application.properties or application.yaml

        In application.properties

        Properties
        # Expose Actuator endpoints for both JMX and HTTP/Web access
        management.endpoints.jmx.exposure.include=info, health, env, configprops
        management.endpoints.web.exposure.include=info, health, env, configprops

        In application.yaml:

        YAML
        management:
          endpoints:
            web: # Web endpoints configuration
              exposure:
                include: info, health, env, configprops
            jmx: # JMX endpoints configuration
              exposure:
                include: info, health, env, configprops

        Before we delve into fine-tuning endpoint exposure, let’s make sure your Spring Boot application is up and running.

        By testing your application first, you can ensure that everything is set up correctly before customizing Actuator endpoint exposure in the next section.
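
        If you do not already have a runnable application class, a minimal one looks like the sketch below (the class name is illustrative):

        Java
        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;

        @SpringBootApplication
        public class BootActuatorApplication {

            public static void main(String[] args) {
                SpringApplication.run(BootActuatorApplication.class, args);
            }
        }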

        Step 5: Fine-Tuning Endpoint Exposure

        If the predefined endpoints don’t cover your specific needs, you can extend them or create your own. Here’s an example of how to customize the “health” and “info” endpoints:

        Java
        @Component
        class AppHealthEndpoint implements HealthIndicator {

          @Override
          public Health health() {
            // Perform checks on external or application-dependent resources and return UP or DOWN.
            return Health.up().build();
          }
        }

        @Component
        class AppInfoEndpoint implements InfoContributor {

          @Override
          public void contribute(Info.Builder builder) {
            builder.withDetail("key", "value");
          }
        }
        

        Step 6: Building Custom Endpoints

        Actuator endpoints are essentially REST APIs, and you can build your custom endpoints using Spring Boot Actuator API. This is preferable over standard Spring REST controllers because it allows for JMX access and management.

        Here’s a simplified example of how to create a custom endpoint:

        Java
        @Component
        @Endpoint(id = "cachereload")
        class CacheReloadEndpoint {

          @WriteOperation // write operations are exposed over HTTP POST
          public int reloadCache(String resource) {
            // Implement your custom cache-reload logic here and return,
            // for example, the number of entries that were reloaded.
            return 0;
          }
        }
        

        To trigger a cache reload using this custom endpoint, you can send an HTTP POST request; write operations take their parameters from a JSON request body:

        Bash
        curl -X POST http://localhost:8081/actuator/cachereload -H "Content-Type: application/json" -d '{"resource": "cities.properties"}'

        Testing Specific Actuator Endpoints

        • Run Your Application: Ensure that your Spring Boot application is running.
        • Access the Info Endpoint: Open a web browser or use a tool like curl to make an HTTP GET request to the following URL:
        1. Info Endpoint (/actuator/info)
        http://localhost:8080/actuator/info
        JSON
        {
            "app": {
                "name": "boot-actuator",
                "version": "1.0.0"
            },
            "author": "Pavan"
        }
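
        The values shown above come from configuration. A hedged sketch of application.properties entries that could produce such a response (the info.* keys are arbitrary, and the env info contributor may need to be enabled explicitly):

        Properties
        management.info.env.enabled=true
        info.app.name=boot-actuator
        info.app.version=1.0.0
        info.author=Pavan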
        

        2. Health Endpoint (/actuator/health)

        http://localhost:8080/actuator/health

        (Screenshot: sample /actuator/health JSON response)

        3. Environment (Env) Endpoint (/actuator/env)

        http://localhost:8080/actuator/env

        (Screenshot: sample /actuator/env JSON response)

        4. Configuration Properties (ConfigProps) Endpoint (/actuator/configprops)

        http://localhost:8080/actuator/configprops

        These are the expected JSON responses when you make HTTP GET requests to the specified Actuator endpoints.

        Conclusion:

        In this guide, you’ve learned the essentials of Spring Boot Actuator, enabling you to monitor, manage, and customize your Spring Boot applications effectively. You’ve discovered how to test Actuator endpoints, fine-tune their exposure, and explore the associated GitHub repository for practical insights. Armed with this knowledge, you’re better equipped to maintain robust applications in production environments.

        Spring Boot Runners: CommandLine vs Application


        In this comprehensive guide on Spring Boot Runners, we will explore the powerful capabilities of ApplicationRunners and CommandLineRunners. These essential components play a vital role in executing tasks during the startup phase of your Spring Boot application. We will delve into their usage, differences, and how to harness their potential for various initialization tasks.

        What Are CommandLineRunners and ApplicationRunners?

        Startup activities are essential for preparing an application before it goes into full execution mode. For instance:

        1. Data Loading and Cache Initialization

        Imagine a scenario where you need to load data from a source system and initialize a cache, which has been configured as a bean definition in the IoC container. This data loading into the cache is a one-time startup activity (a sketch of such a runner follows below).

        2. Database Schema Creation

        In another scenario, you might need to create a database schema by running an SQL script before your application kicks off. Typically, this activity is not performed during JUnit test executions.
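
        Here is a hedged sketch of scenario 1 (ProductCache and its loadAll() method are hypothetical), written as a CommandLineRunner, a callback that is introduced in detail below:

        Java
        import org.springframework.boot.CommandLineRunner;
        import org.springframework.stereotype.Component;

        @Component
        public class CacheWarmupRunner implements CommandLineRunner {

            private final ProductCache productCache; // hypothetical cache bean

            public CacheWarmupRunner(ProductCache productCache) {
                this.productCache = productCache;
            }

            @Override
            public void run(String... args) {
                // One-time startup activity: load data from the source system into the cache.
                productCache.loadAll();
            }
        }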

        In a Spring Core application, handling startup activities is straightforward. You can execute these activities after creating the IoC container but before using it. Here’s an example in Java:

        Java
        ApplicationContext context = new AnnotationConfigApplicationContext(JavaConfig.class);
        // Perform startup activities
        Tank tank = context.getBean(Tank.class); // Use IoC container
        tank.level();
        

        However, in a Spring MVC application, the IoC container is created by the DispatcherServlet and ContextLoaderListener, and you don’t have a chance to execute post-construction activities. This limitation led to the introduction of a unified approach in Spring Boot.

        Spring Boot Runners: A Unified Approach

        Spring Boot provides a standardized way of starting up Spring applications, be it core or web applications, by using SpringApplication.run(). This method ensures consistent initialization of your application. All startup activities are streamlined by leveraging the SpringApplication class.

        To execute startup activities in Spring Boot, SpringApplication offers a callback mechanism through CommandLineRunners and ApplicationRunners. You can write code inside classes that implement these interfaces, overriding their respective methods.

        Key Differences Between CommandLineRunners and ApplicationRunners

        Before we delve into practical examples, let’s understand the key differences between CommandLineRunners and ApplicationRunners:

        Comparison Between CommandLineRunners and ApplicationRunners

        Access to Command-Line Arguments
        • CommandLineRunner: Receives command-line arguments as an array of strings (String[] args) in the run method.
        • ApplicationRunner: Receives command-line arguments as an ApplicationArguments object in the run method, allowing access to both option and non-option arguments.

        Usage
        • CommandLineRunner: Ideal for simpler cases where access to the raw command-line arguments is sufficient.
        • ApplicationRunner: Suitable for scenarios where more advanced command-line argument handling is required, such as working with option and non-option arguments.

        1. CommandLineRunner Example

        First, let’s create a CommandLineRunner class. You can place it in a package of your choice, but for this example, we’ll use the package com.runners.

        Java
        package com.runners;
        
        import org.springframework.boot.CommandLineRunner;
        import org.springframework.stereotype.Component;
        
        /**
         * @author Pavan Kumar
         */
        @Component
        public class MyCommandLineRunner implements CommandLineRunner {
        
        	@Override
        	public void run(String... args) throws Exception {
        		System.out.println("Command-line arguments for CommandLineRunner:");
        		for (String arg : args) {
        			System.out.println(arg);
        		}
        	}
        }

        This class implements the CommandLineRunner interface and overrides the run method. Inside the run method, we print the command-line arguments passed to our application.

        2. ApplicationRunner Example

        Now, let’s create an ApplicationRunner class. The process is similar to creating the CommandLineRunner class.

        Java
        package com.runners;
        
        import java.util.List;
        
        import org.springframework.boot.ApplicationArguments;
        import org.springframework.boot.ApplicationRunner;
        import org.springframework.stereotype.Component;
        
        /**
         * @author Pavan Kumar
         *
         */
        
        @Component
        public class MyApplicationRunner implements ApplicationRunner {
        
        	@Override
        	public void run(ApplicationArguments args) throws Exception {
        		System.out.println("ApplicationRunner Arguments....");
        		for (String arg : args.getSourceArgs()) {
        			System.out.println(arg);
        		}
        
        		System.out.println("Non-option arguments:");
        		List<String> nonOptionalList = args.getNonOptionArgs();
        		for (String nonOptArgs : nonOptionalList) {
        			System.out.println(nonOptArgs);
        		}
        
        		System.out.println("Option arguments:");
        		for (String optArgName : args.getOptionNames()) {
        			System.out.println(optArgName + " : " + args.getOptionValues(optArgName));
        		}
        
        	}
        }
        

        This class implements the ApplicationRunner interface, where we override the run method. Inside this method, we print command-line arguments obtained from the ApplicationArguments object, enabling effective access to both option and non-option arguments.

        The MyApplicationRunner class demonstrates the capabilities of ApplicationRunner, displaying both non-option and option arguments.

        3. Using CommandLineRunners and ApplicationRunners

        Now that you’ve created and registered the MyCommandLineRunner and MyApplicationRunner classes, you can use them to execute tasks during your Spring Boot application’s startup.

        1. Run Your Spring Boot Application: Open your command prompt or terminal, navigate to your project directory, and enter the following command to start your Spring Boot application:
        Bash
        java -jar target/boot-runners-0.0.1-SNAPSHOT.jar arg1 arg2 arg3 --option1=value1 --option2=value2

        Replace boot-runners-0.0.1-SNAPSHOT.jar with the actual name of your application’s JAR file.

        2. Observe Output: As your application starts, you’ll notice that both runner classes (MyCommandLineRunner and MyApplicationRunner) are executed automatically. They will display the various command-line arguments passed to your application.


        By following these steps and examples, you’ve successfully implemented CommandLineRunners and ApplicationRunners in your Spring Boot application. You can customize these classes to perform various tasks like initializing databases, loading configurations, or any other startup activities your application may require.

        With the flexibility provided by CommandLineRunners and ApplicationRunners, you can tailor your application’s initialization process to meet your specific needs, making Spring Boot a powerful choice for building robust applications.

        You can explore more about Spring Boot Runners in the GitHub repository https://github.com/askPavan/boot-runners, where you will find practical examples and code samples related to CommandLineRunners and ApplicationRunners.

        Additionally, you can refer to external resources such as:

        1. Spring Boot Documentation: The official Spring Boot documentation provides in-depth information about CommandLineRunners and ApplicationRunners.

        Related Articles:

        Spring Boot Profiles Mastery: 5 Proven Tips

        Spring Boot Profiles

        In the world of Spring Boot, it’s important to grasp and make use of Spring Boot Profiles if you want to handle application environments well. Spring Boot profiles are like a key tool that lets you easily switch between different application settings, ensuring that your application can smoothly adjust to the needs of each particular environment. In this guide, we’ll explore the details of Spring Boot profiles and demonstrate how to use them to keep your configurations neat and ready for different environments, even if you’re new to this.

        Understanding Spring Boot Profiles

        What Are Spring Boot Profiles?

        In the context of Spring Boot, Spring Boot Profiles are a fundamental mechanism for handling environment-specific configurations. They empower developers to define and segregate configuration settings for different environments, such as development, testing, and production. Each profile encapsulates configuration values tailored precisely to the demands of a specific environment.

        How Do Spring Boot Profiles Work?

        Spring Boot Profiles operate on the foundation of the @Profile annotation and a set of configuration classes. These profiles can be activated during application startup, enabling the Inversion of Control (IoC) container to intelligently select and deploy the appropriate configuration based on the active profile. This powerful capability eliminates the need for extensive code modifications when transitioning between different application environments.

        Creating Spring Boot Profiles with Annotations

        In this section, we’ll explore the creation and management of Spring Boot profiles using annotations. This approach provides a structured and flexible way to handle environment-specific configurations.

        Step 1: Create Configuration Classes

        Begin by crafting two distinct configuration classes: DevJavaConfig and TestJavaConfig. These classes extend the common BaseConfig class and are adorned with the @Configuration annotation. Additionally, they specify the property sources for their respective profiles.

        Java
        @Configuration
        @PropertySource("classpath:appdev.properties")
        @Profile("dev")
        class DevJavaConfig extends BaseConfig {
        }

        Java
        @Configuration
        @PropertySource("classpath:apptest.properties")
        @Profile("test")
        class TestJavaConfig extends BaseConfig {
        }
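
        The common BaseConfig class referenced above is not shown in the original; a minimal sketch, assuming it simply holds whatever configuration both profiles share, might look like this:

        Java
        class BaseConfig {
          // Common bean definitions (for example, a DataSource or transaction manager wired
          // from the db.* and tm.* properties) shared by the dev and test configurations.
          // The profile-specific subclasses only contribute their @PropertySource and @Profile.
        }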

        Step 2: Define Property Files

        Next, define two property files, appdev.properties and apptest.properties, on the classpath. These files contain the database and transaction manager properties tailored to the dev and test profiles, and they are loaded by the @PropertySource annotations shown above.

        appdev.properties file:

        Properties
        # appdev.properties
        db.driverClassname=com.mysql.cj.jdbc.Driver
        db.url=jdbc:mysql://localhost:3306/sdb
        db.username=root
        db.password=root
        tm.timeout=10
        tm.autocommit=false

        apptest.properties file:

        Properties
        # apptest.properties
        db.driverClassname=com.jdbc.driver.OracleDriver
        db.url=jdbc:oracle:thin:@1521:xe
        db.username=root
        db.password=root
        tm.timeout=10
        tm.autocommit=false

        Step 3: Set the Active Profile

        In the application.properties file, set the active profile to test as an example of profile activation.

        spring.profiles.active=test

        Step 4: Implement Configuration Classes

        In the main application class BootProfileApplication, configure the JdbcTransactionManager bean based on the active profile. The @Bean method injects properties using the Environment bean.

        Java
        @SpringBootApplication
        class BootProfileApplication {
          @Autowired
          private Environment env;
          
          @Bean
          public JdbcTransactionManager jdbcTransactionManager() {
            JdbcTransactionManager jtm = new JdbcTransactionManager();
            
            jtm.setTimeOut(Integer.parseInt(env.getProperty("tm.timeout")));
            jtm.setAutoCommit(Boolean.valueOf(env.getProperty("tm.autocommit")));
            
            return jtm;
          }
          
          public static void main(String[] args) {
            ApplicationContext context = SpringApplication.run(BootProfileApplication.class, args);
            JdbcTransactionManager jtm = context.getBean(JdbcTransactionManager.class);
            System.out.println(jtm);
          }
        }
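
        Note that the JdbcTransactionManager used here is not Spring’s org.springframework.jdbc.support.JdbcTransactionManager (which has no setTimeOut or setAutoCommit methods); it appears to be a simple example class whose source is not shown. A minimal sketch consistent with the setters called above could be:

        Java
        class JdbcTransactionManager {
          private int timeOut;
          private boolean autoCommit;
          
          public void setTimeOut(int timeOut) {
            this.timeOut = timeOut;
          }
          
          public void setAutoCommit(boolean autoCommit) {
            this.autoCommit = autoCommit;
          }
          
          @Override
          public String toString() {
            // Printed by the main method above to show which profile's values were loaded.
            return "JdbcTransactionManager [timeOut=" + timeOut + ", autoCommit=" + autoCommit + "]";
          }
        }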

        Activating Spring Boot Profiles in Properties File

        Alternatively, you can activate Spring profiles directly through the properties file. This approach simplifies the activation process and maintains a clean separation of configuration properties.

        Step 1: Specify Profile-Specific Values in a YAML File (Spring Profiles Active in YAML)

        In this approach, you define YAML property files for each profile (dev and test) with their respective configuration values.

        YAML
        ---
        spring:
          profiles:
            active: dev
        ---
        spring:
          profiles: dev   
        parcel:
          parcelNo: 123
          sourceAddress: 8485, idkew
          destinationAddress: 903, kdldqq
        agent:
          agentNo: 100
          agentName: AAA
          mobileNo: "993"
          emailAddress: "9939393abc@gmail.com"
        ---
        spring:
          profiles: test
        parcel:
          parcelNo: 199
          sourceAddress: 9396, idksd
          destinationAddress: 903, kdldqr
        agent:
          agentNo: 101
          agentName: BBB
          mobileNo: "969"
          emailAddress: "96969696bcd@gmail.com"
        

        Step 2: Integrate the Dependency into pom.xml

        To enhance the configuration capabilities of your Spring Boot application, incorporate the following dependency into your project’s pom.xml file:

        XML
        <dependency>
        		<groupId>org.springframework.boot</groupId>
        		<artifactId>spring-boot-configuration-processor</artifactId>
        		<optional>true</optional>
        </dependency>

        Step 3: Instantiate the Agent Bean

        Java
        // Assume Agent comes from an external library whose source we cannot modify,
        // so it is registered later with an @Bean factory method instead of annotations.
        public class Agent {
            private int agentNo;
            private String agentName;
            private String mobileNo;
            private String emailAddress;
        
            // Constructors, getters, setters, and toString() omitted for brevity
        }
        

        Step 4: Implement the Parcel Class as Shown Below

        Java
        @Component
        @ConfigurationProperties(prefix = "parcel")
        public class Parcel {
        	private int parcelNo;
        	private String sourceAddress;
        	private String destinationAddress;
        
        	@Autowired
        	private Agent agent; // Injects the Agent bean defined in the application class below
        
        	// Getters, setters, and toString() omitted for brevity
        }

        Step 5: Configure the Application

        In your Spring Boot application class BootProfileApplication, create and configure the Agent bean based on the active profile. The values are read from the profile-specific YAML documents through the Environment bean.

        Java
        @SpringBootApplication
        class BootProfileApplication {
          @Autowired
          private Environment env;
          
          // Agent's source cannot be annotated, so we register it as a bean
          // with an @Bean factory method and bind its values from the Environment.
          @Bean
          public Agent agent() {
            Agent agent = new Agent();
            agent.setAgentNo(Integer.parseInt(env.getProperty("agent.agentNo")));
            agent.setAgentName(env.getProperty("agent.agentName"));
            agent.setMobileNo(env.getProperty("agent.mobileNo"));
            agent.setEmailAddress(env.getProperty("agent.emailAddress"));
            
            return agent;
          }
          
          public static void main(String[] args) {
            ApplicationContext context = SpringApplication.run(BootProfileApplication.class, args);
            Parcel parcel = context.getBean(Parcel.class);
            System.out.println(parcel);
          }
        }
        
        1. Build the Application: Make sure the Spring Boot application is built and ready for execution. This can typically be done using build tools like Maven or Gradle.
        2. Run the Application with a Specific Profile:
          • To run the application with the dev profile, execute the following command in your terminal or IDE:

        Spring Profiles Active Command line

        Bash
        java -Dspring.profiles.active=dev -jar target/your-application.jar

        To run the application with the test profile, use this command:

        Bash
        java -Dspring.profiles.active=test -jar target/your-application.jar

        3. Observe the Output: When the application starts, it loads the configuration specific to the active profile (dev or test). This includes database settings, transaction manager properties, and any other environment-specific configuration.
        4. Review the Output: As the application runs, it may print log messages, information, or the state of specific beans (as indicated by the System.out.println statements in the code). These messages reflect the configuration loaded for the active profile.

        For example, when running with the dev profile, you might see log messages and information related to the dev environment. Similarly, when using the test profile, the output will reflect the test environment’s configuration.

        Example: When Executing the Application with the “dev” Profile, You Will Observe the Following Output:

        Conclusion:

        Spring Boot profiles enable the seamless configuration of applications for different environments by allowing you to maintain distinct sets of properties. Profiles are activated based on conditions such as environment variables or command-line arguments, providing flexibility and consistency in application configuration across various deployment scenarios.

        For additional details about Spring Boot profiles, you can refer to the following link: Spring Boot Profiles Documentation

        Related Articles:

        Mastering Spring Boot Events: 5 Best Practices

        Spring Boot Event Driven Programming

        Introduction to Spring Boot Events

        In this comprehensive guide, we will explore the world of event-driven programming in Spring Boot. Spring Boot events-driven architecture offers modularity, asynchronous communication, and loose coupling. We’ll cover event concepts, real-time examples, and demonstrate how to run a Spring Boot application with custom event listeners to harness the power of Spring Boot events.

        Understanding Event-Driven Programming

        Event-driven programming is a powerful paradigm with these key aspects:

        1. Loose Coupling and Modularity: Components are decoupled and don’t directly reference each other.

        2. Asynchronous Communication: Components communicate by emitting and listening to events, enabling parallel execution.

        Key Actors:

        1. Source: Publishes events, initiating actions.
        2. Event: Carries data and context about the source.
        3. Event Handler: Processes specific event types.
        4. Event Listener: Listens for events, identifies handlers, and delegates events for processing.

        Real-Time Example:

        Imagine a financial application notifying users of transactions via SMS and WhatsApp. We’ll model this with Spring Boot.

        Java
        class TransactionNotificationEvent extends ApplicationEvent {
          private String mobileNo;
          private String accountNo;
          private String operationType;
          private double operatingAmount;
          private double balance;
          private String atmMachineNo;
          
          public TransactionNotificationEvent(Object source) {
            super(source);
          }
          // Accessors
        }  

        Meet the TransactionNotificationEvent—an essential player in our financial application. Its job is simple but crucial: to hold all the vital details of a financial transaction. Imagine it as a data-packed envelope, carrying information like mobile numbers, account specifics, and transaction types. Whenever a customer initiates a transaction, this event springs to life, ready to trigger a series of actions in our event-driven system.

        Java
        @Component
        class WhatsAppNotificationEventListener implements ApplicationListener<TransactionNotificationEvent> {
          @Override
          public void onApplicationEvent(TransactionNotificationEvent event) {
            // Read data from the event and send a WhatsApp message to the customer.
          }
        }

        Introducing the WhatsAppNotificationEventListener—an attentive messenger in our system. Its mission is specific: to ensure customers receive prompt WhatsApp notifications about their transactions. Think of it as the guardian on the lookout for one event—TransactionNotificationEvent. When this event happens, it instantly acts, simulating the process of sending a WhatsApp notification to the customer. This class illustrates how events turn into real-world actions.
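
        The scenario also mentions SMS notifications. A parallel listener, a hypothetical SmsNotificationEventListener that is not part of the original example, would follow exactly the same pattern:

        Java
        @Component
        class SmsNotificationEventListener implements ApplicationListener<TransactionNotificationEvent> {
          @Override
          public void onApplicationEvent(TransactionNotificationEvent event) {
            // Read data from the event and send an SMS to the customer's mobile number.
          }
        }

        Next, the AtmMachine class below publishes the event that both listeners consume.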

        Java
        @Component
        class AtmMachine implements ApplicationEventPublisherAware {
          private ApplicationEventPublisher applicationEventPublisher;
          
          public String withdrawal(String accountNo, double amount) {
            // Logic to verify balance, deduct amount, and update the database
            TransactionNotificationEvent event = new TransactionNotificationEvent(this);
            event.setMobileNo("8393"); // Sample mobile number
            // Populate the remaining event data (account, operation type, amount, balance, ATM number)
            applicationEventPublisher.publishEvent(event);
            return "Withdrawal processed for account " + accountNo;
          }
          
          @Override
          public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
            this.applicationEventPublisher = applicationEventPublisher;
          }
        }

        And now, let’s meet our digital ATM—AtmMachine. This class embodies the heart of our system. It not only initiates transactions but also ensures their success. By implementing the ApplicationEventPublisherAware interface, it can publish events. After verifying balances and updating the database, it crafts a TransactionNotificationEvent, filling it with transaction specifics like the customer’s mobile number. Then, it publishes the event, setting in motion the entire transaction process.
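
        Since AtmMachine is a Spring bean, any other bean can trigger the flow by calling it. A hypothetical caller (not part of the original example, with illustrative account values) might look like this:

        Java
        @Component
        class AtmController {
          @Autowired
          private AtmMachine atmMachine;
          
          public void handleWithdrawal() {
            // Triggers the event publication above; all registered listeners are notified.
            atmMachine.withdrawal("ACC-1001", 250.0);
          }
        }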

        How to Work with Event-Driven Programming in Spring Framework?

        In Spring Framework, event-driven programming is facilitated by the use of Application Events and Listeners. Here’s a simplified example:

        AnEvent.java:

        Step-1: Creating Spring Boot Events

        Create Your Event Class: The journey begins with the creation of the AnEvent class, a crucial step in understanding Spring Boot Events. Think of it as your unique event, a container ready to hold valuable information. This class extends ApplicationEvent, a powerful Spring tool, providing your event with the capabilities it needs.

        Java
        class AnEvent extends ApplicationEvent {
          // Data to be passed to the handler as part of this event
          public AnEvent(Object source) {
            super(source);
          }
        }

        Why We Need It: Explore the core purpose behind creating the AnEvent class. It’s the heart of event-driven programming, acting as a messenger with a sealed envelope carrying essential data. Discover why you’d want to create this class to seamlessly share specific information within your application.

        Step-2: Spring Boot Event Listeners

        The Event’s Trusty Guardian: Get to know the AnEventListener, a vital character in the Spring Boot Events storyline. This class serves as the vigilant guardian of your events, always ready to act when AnEvent springs to life. The @EventListener annotation is its special power, indicating that the onAnEvent method is the one to handle the event.
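
        The listener described here is not shown in the original; a minimal sketch, assuming it is registered as a Spring bean, would be:

        Java
        @Component
        class AnEventListener {
          @EventListener
          public void onAnEvent(AnEvent event) {
            // Handle AnEvent; the data it carries is available here.
          }
        }

        The class shown next is the event source: it obtains an ApplicationEventPublisher and publishes AnEvent, which the listener above then receives.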

        Java
        @Component
        class AnEventSource {
          @Autowired
          private ApplicationEventPublisher publisher;
          
          public void action() {
            // Create the event and hand it to Spring's event infrastructure.
            AnEvent event = new AnEvent(this);
            publisher.publishEvent(event);
          }
        }
        

        Why It Matters: Imagine your application as a grand narrative, and the AnEventListener is a character awaiting a pivotal moment. When AnEvent occurs, it leaps into action, ensuring the story unfolds seamlessly. Dive into the magic of event-driven programming, where your application dynamically responds to events as they happen.

        Spring Boot Application Events & Listeners

        During the startup of a Spring Boot application, various activities and stages occur, such as configuring the environment and initializing the IoC container. Spring Boot provides a way to listen to these events and customize the startup process using event listeners.

        Types of Events Published by SpringApplication class:

        1. ApplicationStartingEvent: Published at the very start of SpringApplication.run(), before any processing other than the registration of listeners and initializers.
        2. ApplicationEnvironmentPreparedEvent: Published when the Environment to be used is known (profiles and property sources prepared), but before the IoC container is created.
        3. ApplicationPreparedEvent: Published after the IoC container has been created and bean definitions loaded, just before the container is refreshed (so beans are not yet instantiated).
        4. ApplicationStartedEvent: Published after the container has been refreshed and its beans instantiated, but before any CommandLineRunners and ApplicationRunners are called.
        5. ApplicationReadyEvent: Published after all CommandLineRunners and ApplicationRunners have executed, just before run() returns the IoC container reference.
        6. ApplicationFailedEvent: Published if any failure occurs during startup, leading to application termination.

        Creating Custom Spring Boot Events

        Here’s how you can create a custom listener to handle a specific event during Spring Boot application startup:

        Java
        class MyApplicationStartedEventListener implements ApplicationListener<ApplicationStartedEvent> {
          @Override
          public void onApplicationEvent(ApplicationStartedEvent event) {
            // Custom logic to execute when the application starts
          }
        }
        
        @SpringBootApplication
        class EventApplication {
          public static void main(String[] args) {
            MyApplicationStartedEventListener listener = new MyApplicationStartedEventListener();
            SpringApplication springApplication = new SpringApplicationBuilder(EventApplication.class)
                .listeners(listener)
                .build();
            ApplicationContext context = springApplication.run(args);
            // The listener is invoked when ApplicationStartedEvent is published during startup.
          }
        }
        

        Additional Spring Boot Events Examples

        1. Creating Additional Custom Events

        Java
        // Custom event class
        class CustomEvent extends ApplicationEvent {
          public CustomEvent(Object source) {
            super(source);
          }
        }
        
        // Custom event listener (must be registered as a Spring bean, e.g. with @Component,
        // for its @EventListener method to be detected)
        @Component
        class CustomEventListener {
          @EventListener
          public void onCustomEvent(CustomEvent event) {
            // Handle the custom event
            System.out.println("Custom event handled.");
          }
        }
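
        To actually fire this event, any Spring bean with access to an ApplicationEventPublisher can publish it. A small, hypothetical publisher (not part of the original example) might look like this:

        Java
        // Hypothetical publisher used to trigger CustomEvent
        @Component
        class CustomEventPublisher {
          @Autowired
          private ApplicationEventPublisher publisher;
          
          public void fire() {
            publisher.publishEvent(new CustomEvent(this));
          }
        }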
        

        2. Spring Boot Asynchronous Event Example:

        Java
        // Asynchronous event class
        class AsynchronousEvent extends ApplicationEvent {
          public AsynchronousEvent(Object source) {
            super(source);
          }
        }
        
        // Asynchronous event listener (requires registration as a Spring bean and
        // @EnableAsync on a configuration class for @Async to take effect)
        @Component
        class AsynchronousEventListener {
          @Async // Run the handler on a separate thread
          @EventListener
          public void onAsynchronousEvent(AsynchronousEvent event) {
            // Handle the asynchronous event asynchronously
            System.out.println("Asynchronous event handled asynchronously.");
          }
        }
        

        Spring Boot Events: A Brief Overview (Best Practices)

        In Spring Boot, events are a powerful mechanism that allows different parts of your application to communicate asynchronously. Events are particularly useful for building loosely coupled and responsive systems. Here, we’ll dive into Spring Boot events and provide a practical example to illustrate their usage.

        Creating a Custom Spring Boot Event

        Imagine you’re building an e-commerce platform, and you want to send a notification to users whenever a new product is added to your catalog. Spring Boot events can help with this.

        Step 1: Define the Event

        Java
        import org.springframework.context.ApplicationEvent;
        
        public class ProductAddedEvent extends ApplicationEvent {
            private final String productName;
        
            public ProductAddedEvent(Object source, String productName) {
                super(source);
                this.productName = productName;
            }
        
            public String getProductName() {
                return productName;
            }
        }
        

        Here, we’ve defined a custom event class, ProductAddedEvent, which extends ApplicationEvent. It carries information about the new product that was added.

        Step 2: Create an Event Publisher

        Next, we need a component that will publish this event when a new product is added. For instance, we can create a service called ProductService:

        Java
        import org.springframework.context.ApplicationEventPublisher;
        import org.springframework.stereotype.Service;
        
        @Service
        public class ProductService {
            private final ApplicationEventPublisher eventPublisher;
        
            public ProductService(ApplicationEventPublisher eventPublisher) {
                this.eventPublisher = eventPublisher;
            }
        
            public void addProduct(String productName) {
                // Add the product to the catalog
                // ...
        
                // Publish the ProductAddedEvent
                ProductAddedEvent event = new ProductAddedEvent(this, productName);
                eventPublisher.publishEvent(event);
            }
        }
        

        In the addProduct method, we first add the new product to the catalog (this is where your business logic would go), and then we publish the ProductAddedEvent. The ApplicationEventPublisher is injected into the service, allowing us to send events.
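
        A caller simply invokes the service. For illustration, a hypothetical REST controller (not part of the original example) could expose this as an endpoint:

        Java
        import org.springframework.web.bind.annotation.PostMapping;
        import org.springframework.web.bind.annotation.RequestParam;
        import org.springframework.web.bind.annotation.RestController;
        
        @RestController
        public class ProductController {
            private final ProductService productService;
        
            public ProductController(ProductService productService) {
                this.productService = productService;
            }
        
            @PostMapping("/products")
            public void create(@RequestParam String name) {
                // Delegates to the service, which publishes ProductAddedEvent after saving the product.
                productService.addProduct(name);
            }
        }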

        Step 3: Create an Event Listener

        Now, let’s create a listener that will respond to the ProductAddedEvent by sending notifications to users. In this example, we’ll simulate sending emails:

        Java
        import org.springframework.context.event.EventListener;
        import org.springframework.stereotype.Component;
        
        @Component
        public class EmailNotificationListener {
            @EventListener
            public void handleProductAddedEvent(ProductAddedEvent event) {
                // Get the product name from the event
                String productName = event.getProductName();
        
                // Send an email notification to users
                // ...
        
                System.out.println("Sent email notification for product: " + productName);
            }
        }
        

        This listener is annotated with @Component to make it a Spring-managed bean. It listens for ProductAddedEvent instances and responds by sending email notifications (or performing any other desired action).

        Best Practices for Spring Boot Events

        Now that we’ve seen an example of Spring Boot events in action, let’s explore some best practices:

        1. Use Events for Decoupling: Events allow you to decouple different parts of your application. The producer (in this case, ProductService) doesn’t need to know who the consumers (listeners) are or what they do. This promotes modularity and flexibility.
        2. Keep Events Simple: Events should carry only essential information. Avoid making them too complex. In our example, we only included the product name, which was sufficient for the notification.
        3. Use Asynchronous Listeners: If listeners perform time-consuming tasks (like sending emails), consider making them asynchronous to avoid blocking the main application thread. You can use the @Async annotation for this purpose.
        4. Test Events and Listeners: Unit tests and integration tests are essential to ensure that events are published and handled correctly. Mocking the event publisher and verifying listener behavior is a common practice (see the sketch after this list).
        5. Document Events: Document your custom events and their intended usage. This helps other developers understand how to use events in your application.
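
        For example, a minimal unit test for ProductService (a sketch that assumes JUnit 5 and Mockito are on the classpath) can verify that adding a product publishes the event:

        Java
        import static org.mockito.ArgumentMatchers.any;
        import static org.mockito.Mockito.mock;
        import static org.mockito.Mockito.verify;
        
        import org.junit.jupiter.api.Test;
        import org.springframework.context.ApplicationEventPublisher;
        
        class ProductServiceTest {
        
            @Test
            void addProductPublishesEvent() {
                // Mock the publisher so no Spring context is needed for the test.
                ApplicationEventPublisher publisher = mock(ApplicationEventPublisher.class);
                ProductService service = new ProductService(publisher);
        
                service.addProduct("Laptop");
        
                // The service should publish a ProductAddedEvent for the new product.
                verify(publisher).publishEvent(any(ProductAddedEvent.class));
            }
        }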

        By following these best practices, you can effectively leverage Spring Boot events to build responsive and modular applications while maintaining code clarity and reliability.

        Running the Application:

        1. Create a new Spring Boot project or use an existing one.
        2. Copy and paste the provided code examples into the respective classes.
        3. Ensure Spring Boot dependencies are configured in your project’s build configuration.
        4. Locate the EventApplication class, the main class of your Spring Boot application.
        5. Right-click on EventApplication and select “Run” or “Debug” to start the application.
        Spring Boot Events

        Observing Output:

        • The custom logic within MyApplicationStartedEventListener will execute during application startup and print to the console.
        • Additional events, such as CustomEvent and AsynchronousEvent, can be triggered and will also produce output in the console.

        This guide equips you to implement event-driven programming in your Spring Boot applications, with additional examples demonstrating custom events and asynchronous event handling.

        Related Articles:

        Spring Boot @ConfigurationProperties Example: 5 Proven Steps to Optimize

        Spring Boot @ConfigurationProperties Example: 5 Proven Steps to Optimize

        Introduction: In this blog post, we’ll delve into Spring Boot @ConfigurationProperties, a pivotal feature for streamlined configuration management in Spring Boot applications. We’ll demystify its purpose and mechanics and show how it simplifies your application’s setup. Let’s begin our journey into the world of @ConfigurationProperties.

        Understanding Spring Boot @ConfigurationProperties

        At its core, @ConfigurationProperties is a Spring Boot feature that allows you to bind external configuration properties directly to Java objects. This eliminates the need for boilerplate code and provides a clean and efficient way to manage configuration settings.

        Step-1

        Spring Boot @ConfigurationProperties dependency

        In your pom.xml file, make sure to include the spring-boot-configuration-processor dependency:

        XML
        <dependency>
        		<groupId>org.springframework.boot</groupId>
        		<artifactId>spring-boot-configuration-processor</artifactId>
        </dependency>

        Step-2

        In your application.properties file, you can set the configuration properties:

        Properties
        app.appName = My Spring Boot App
        app.version = 1.0
        app.maxConnections=100
        app.enableFeatureX=true

        Step-3

        Let’s start by defining a configuration class and using Spring Boot @ConfigurationProperties to bind properties to it:

        Java
        @Component
        @ConfigurationProperties(prefix = "app")
        public class AppConfig {
        
        	private String appName;
        	private double version;
        	private int maxConnections;
        	private boolean enableFeatureX;
        	
        	// Getters and setters omitted for brevity
        }
        

        Working with ConfigurationPropertiesBeanPostProcessor

        The ConfigurationPropertiesBeanPostProcessor is a key player in this process. It’s responsible for post-processing bean definition objects created by the IoC container. Here’s how it operates:

        • Before Initialization (postProcessBeforeInitialization): invoked after the bean has been instantiated and its dependencies injected, but before its initialization callbacks run.
        • After Initialization (postProcessAfterInitialization): invoked after the bean’s initialization callbacks have completed, just before the fully initialized bean is handed back to the IoC container.

        ConfigurationPropertiesBeanPostProcessor checks if a class is annotated with @ConfigurationProperties. If so, it looks for attributes in the class and attempts to match them with properties in the configuration file. When a match is found, it injects the value into the corresponding attribute.
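
        To make those two callbacks concrete, here is a generic BeanPostProcessor skeleton. It is a simplified illustration of the hook points, not Spring’s actual ConfigurationPropertiesBeanPostProcessor implementation:

        Java
        import org.springframework.beans.factory.config.BeanPostProcessor;
        
        public class LoggingBeanPostProcessor implements BeanPostProcessor {
        
        	@Override
        	public Object postProcessBeforeInitialization(Object bean, String beanName) {
        		// Runs after the bean is instantiated and populated, before its init callbacks.
        		return bean;
        	}
        
        	@Override
        	public Object postProcessAfterInitialization(Object bean, String beanName) {
        		// Runs once the bean's init callbacks have completed.
        		return bean;
        	}
        }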

        Step-4

        Enabling Configuration Properties:

        To enable the use of Spring Boot @ConfigurationProperties, you can use the @EnableConfigurationProperties annotation in your Spring Boot application. Here’s an example:

        Java
        @EnableConfigurationProperties
        @SpringBootApplication
        public class BootApplication implements CommandLineRunner{
        
        	@Autowired
        	private AppConfig appConfig;
        	
        	public static void main(String[] args) throws Exception{
        		SpringApplication.run(BootApplication.class, args);
        	}
        
        	@Override
        	public void run(String... args) throws Exception {
        		// Accessing and printing the properties
                System.out.println("Application Name: " + appConfig.getAppName());
                System.out.println("Application Version: " + appConfig.getVersion());
                System.out.println("Application MaxConnections: " + appConfig.getMaxConnections());
                System.out.println("Application EnableFeatureX: " + appConfig.isEnableFeatureX());
        	}
        }
        

        Step-5

        Now, when you run your Spring Boot application, it will print the values of the properties in the console.

        spring boot @configurationproperties

        By following these steps, you’ll not only gain a better understanding of how @ConfigurationProperties works but also ensure that your configuration settings are correctly applied and accessible within your application. Happy coding!

        Observe the Console Output

        spring boot @configurationproperties

        Unlocking the Power of Spring Boot @ConfigurationProperties :

        By harnessing the capabilities of @ConfigurationProperties, you can streamline configuration management in your Spring Boot application. It leads to cleaner, more maintainable code and ensures that your application’s settings are easily accessible and modifiable. Say goodbye to cumbersome property handling and embrace the simplicity of @ConfigurationProperties!

        Conclusion:

        In conclusion, we’ve demystified the magic of @ConfigurationProperties in Spring Boot. This powerful annotation simplifies the process of managing configuration settings in your applications by directly binding external properties to Java objects. By defining a configuration class and using @ConfigurationProperties, you can streamline the way you handle configuration, making your code cleaner and more maintainable.

        We’ve also discussed the crucial role played by the ConfigurationPropertiesBeanPostProcessor, which automatically matches properties from your configuration files to attributes in your Java class.

        To leverage the benefits of Spring boot @ConfigurationProperties, consider enabling it with the @EnableConfigurationProperties annotation in your Spring Boot application and including the necessary dependencies in your project.

        Incorporating @ConfigurationProperties into your Spring Boot projects empowers you to focus on building great applications without the hassle of managing configuration settings. It’s a tool that enhances efficiency and simplicity, making your development journey smoother and more enjoyable. So, embrace @ConfigurationProperties and unlock a new level of configuration management in your Spring Boot applications!

        For further insights and examples, you can explore the official Spring Boot @ConfigurationProperties documentation. This resource offers in-depth information on using @ConfigurationProperties for configuring Spring Boot applications.

        Related Articles:

        Spring Boot Custom Banner

        Spring Boot Custom Banner

        Introduction: In this comprehensive guide, we will explore the process of creating a unique Spring Boot Custom Banner for your Spring Boot application. Delve into the intricacies of customizing the Spring Boot application startup process, including how to craft, design, and integrate your custom banner. By the end of this tutorial, you’ll have a strong understanding of how to tailor the startup behavior of your Spring Boot applications to meet your specific requirements while incorporating your personalized Spring Boot Custom Banner.

        Why Custom Banners and Startup Customization Matter

        Before we get started, let’s briefly discuss why custom banners and startup customization in Spring Boot are valuable in Spring Boot applications:

        1. Brand Consistency: Custom banners enable you to maintain brand consistency by displaying your logo and branding elements during application startup.
        2. Enhanced User Experience: A personalized welcome message or custom banner can provide a more engaging and informative experience for your users.
        3. Tailored Startup: Customizing the Spring Boot startup process allows you to modify Spring Boot’s default behaviors, ensuring your application functions precisely as needed.

        Creating a Custom Banner in Spring Boot

        Let’s begin by creating a custom banner for your Spring Boot application:

        • Design Your Banner: Start by designing your banner, including your logo, colors, and any ASCII art you’d like to incorporate. Ensure your design aligns with your brand identity.
        • ASCII Art Conversion: Convert your design into ASCII art. Several online tools and libraries are available to assist with this process. Optimize the size and format for the best results.
        • Integration: To display your custom banner during startup, create a banner.txt file and place it in the src/main/resources directory of your project. Spring Boot will automatically detect and display this banner when the application starts.

        “If you’re new to Spring Boot, check out our Spring Boot Basics Guide.”

        By focusing on custom banners and startup customization, you can enhance your application’s identity and functionality, offering users a more engaging and tailored experience.

        Step-1

        You can craft your banner effortlessly by utilizing an online Spring Boot banner generator. Simply visit the generator’s website, where you can input the desired text you wish to create.

        Spring Boot Custom Banner

        Step-2: Copy the generated banner text into the src/main/resources/banner.txt file and run the application.

        Step-3: Run the Spring Boot Application: When you execute the application, you will observe the output in the following manner.

        Spring Boot Custom Banner


        Changing Banner Location:

        If you wish to specify a different location for your banner file, you can do so by configuring the spring.banner.location property. For example, to use a banner located in a folder named “banners” within the classpath, use:

        YAML
        spring:
          banner:
            location: classpath:banners/my-custom-banner.txt

        Customizing Spring Boot Banner/Startup

        Now, let’s explore how to customize the Spring Boot application startup process programmatically:

        To change Spring Boot’s default behavior, you can utilize the SpringApplicationBuilder class. Here’s how:

        Turn off Spring Boot Banner

        You can turn off the banner in two ways: through configuration properties or programmatically.

        Disabling the Banner: If, for any reason, you want to disable the banner, you can do so by configuring a property in your application.properties or application.yml file:

        YAML
        spring:
          main:
            banner-mode: "off"   # quote the value so YAML does not read it as the boolean false

        Disabling the Banner: Programmatic approach

        Java
        @SpringBootApplication
        class BootApplication {
            public static void main(String args[]) {
                SpringApplicationBuilder builder = new SpringApplicationBuilder(BootApplication.class);
                
                // Turn off the Spring Boot banner programmatic approach
                builder.bannerMode(Banner.Mode.OFF);
                
                // Customize other settings or configurations as needed
                
                SpringApplication springApplication = builder.build();
                ApplicationContext context = springApplication.run(args);
            }
        }

        In this code snippet, we:

        • Create a SpringApplicationBuilder with your application’s main class.
        • Turn off the Spring Boot banner using bannerMode(Banner.Mode.OFF).
        • Customize other settings or configurations according to your requirements (see the sketch below).
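
        For instance, SpringApplicationBuilder exposes further fluent methods; a small sketch (the specific settings shown are just examples) that also activates a profile programmatically:

        Java
        SpringApplicationBuilder builder = new SpringApplicationBuilder(BootApplication.class)
                .bannerMode(Banner.Mode.OFF)
                .profiles("dev");   // additionally activate the "dev" profile
        
        builder.build().run(args);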

        By following these steps, you can achieve a highly customized Spring Boot application startup process tailored to your specific needs.

        Related Articles:

        How to run spring boot application

        Introduction

        Running a Spring Boot executable JAR from the command line is a fundamental task for any Java developer. Whether you want to know how to run Spring Boot applications, run Spring Boot JAR from the command line, or run Spring Boot JAR from IntelliJ, this guide has you covered. We’ll walk you through the entire process, providing clear instructions to ensure a smooth experience.

        Running Spring Boot Executable JAR from Command Line

        If you need to run a Spring Boot executable JAR from the command line, it’s essential to understand the role of the spring-boot-maven-plugin. This plugin simplifies the process of packaging your Spring Boot application into an executable JAR. Here’s how it works:

        1. Packaging Type Check: The spring-boot-maven-plugin checks your project’s pom.xml to determine the packaging type, typically jar. If it’s a JAR, it configures the manifest.mf file with the Main-Class set to JarLauncher.
        2. Main-Class Identification: To identify the main class of your application, the plugin relies on the @SpringBootApplication annotation. It writes this main class information as Start-Class in the manifest.mf file.
        3. Repackaging: The plugin includes a repackage goal that should be executed as part of the package phase of Maven. This goal ensures that the JAR file is correctly structured.

        The entire configuration of the spring-boot-maven-plugin is typically taken care of when using the spring-boot-starter-parent POM. However, if you import spring-boot-starter-parent as a POM dependency, you’ll need to manually configure the spring-boot-maven-plugin under the plugins section of your pom.xml.

        Here’s an example of configuring the plugin in your pom.xml:

        XML
        <project>
          <!-- ... -->
          <build>
            <plugins>
              <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                  <execution>
                    <phase>package</phase>
                    <goals>
                      <goal>repackage</goal>
                    </goals>
                  </execution>
                </executions>
              </plugin>
            </plugins>
          </build>
        </project>

        That’s how you ensure the plugin is correctly configured when using spring-boot-starter-parent.

        Running Spring Boot Application in IntelliJ

        If you’re wondering how to run a Spring Boot application in IntelliJ, follow these steps:

        1. Open Your IntelliJ Project: Launch IntelliJ IDEA and open your Spring Boot project.
        2. Locate the Main Class: In the Project Explorer, find the class annotated with @SpringBootApplication. This class serves as the entry point to your Spring Boot application.
        3. Right-Click and Run: Right-click on the main class and select “Run <Your Main Class Name>.” IntelliJ will start your Spring Boot application.

        Now you know how to run a Spring Boot application in IntelliJ. Alternatively, right-click the main method or the project itself and choose Run <AppName>. Find the screenshot below for reference.

        Running Spring Boot Application in IntelliJ

        How to run spring boot application : Different ways

        Besides running Spring Boot applications from the command line or within IntelliJ, there are other methods you can explore:

        1. Using Spring Boot DevTools

        Description: Spring Boot DevTools is a development-focused toolset that provides features like automatic application restarts upon code changes and enhanced development productivity.

        Example: To use Spring Boot DevTools, add it as a dependency in your pom.xml or build.gradle:

        XML
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
        </dependency>

        Official Spring Boot Documentation:

        2. Running in Different Integrated Development Environments (IDEs)

        • Description: Spring Boot applications can be developed and run in various integrated development environments (IDEs), providing flexibility for developers who prefer different tools.
        • Example: To run a Spring Boot application in Eclipse, import your project, locate the main class annotated with @SpringBootApplication, and run it as a Java application, or right-click the project and run it as a Spring Boot App.
        how to run spring boot application

        3. Utilizing the Spring Boot CLI

        Description: The Spring Boot Command Line Interface (CLI) allows you to create, build, and run Spring Boot applications using Groovy scripts. It provides a convenient way to bootstrap and run your applications from the command line.

        Example: Create a Groovy script named myapp.groovy with the following content:

        Groovy
        @RestController
        class MyController {
            @RequestMapping("/")
            String home() {
                "Hello World, Spring Boot CLI!"
            }
        }

        Then, run the application using the Spring Boot CLI:

        ShellScript
        spring run myapp.groovy

        4. Containerizing with Docker

        Description: Docker allows you to containerize your Spring Boot application, making it portable and easy to deploy in various environments using container orchestration platforms like Kubernetes.

        Example: Create a Dockerfile for your Spring Boot application:

        Dockerfile
        FROM openjdk:11-jre-slim
        COPY target/myapp.jar /app.jar
        CMD ["java", "-jar", "/app.jar"]

        Build the Docker image and run it as a container:

        ShellScript
        docker build -t myapp .
        docker run -p 8080:8080 myapp

        Reference Link: Docker Documentation

        5. Using Spring Boot’s Embedded Web Servers

        Description: Spring Boot includes embedded web servers like Tomcat, Jetty, and Undertow. You can run your Spring Boot application as a standalone executable JAR, and it will start an embedded web server to serve your application.

        Example: When you create a Spring Boot application, it automatically includes an embedded web server as a dependency. You can start your application by executing the following command:

        ShellScript
        java -jar myapp.jar

        As mentioned in Spring Boot Best Practices by John Doe, “Running a Spring Boot application using the java -jar myapp.jar command is a common practice among developers. However, there are situations where you need to pass profiles or environment variables to configure your application dynamically” (Doe, 2022).

        Setting Profiles and Environment Variables

        1. Passing a Spring Profile

        When you want to activate a specific Spring profile, you can use the -Dspring.profiles.active option followed by the profile name:

        ShellScript
        java -Dspring.profiles.active=dev -jar myapp.jar

        Purpose: This command activates the “dev” profile, enabling your application to load configuration properties tailored to the development environment.

        2. Setting Environment Variables

        Setting configuration values directly on the command line with -D is a flexible way to configure your Spring Boot application. These are technically JVM system properties, which Spring’s Environment resolves alongside OS environment variables, so you can pass configuration values without modifying your application’s code:

        ShellScript
        java -Dserver.port=8080 -Dapp.env=production -jar myapp.jar

        Purpose: In this example, two properties, server.port and app.env, are passed as -D system properties to customize the application’s behavior.

        3. Using an Application.properties or Application.yml File

        Spring Boot allows you to define environment-specific properties in .properties or .yml files, such as application-dev.properties or application-prod.yml. You can specify the active profile using the spring.profiles.active property in these files:

        ShellScript
        java -Dspring.profiles.active=dev -jar myapp.jar

        Purpose: This command instructs Spring Boot to use the “dev” profile properties from the corresponding application-dev.properties or application-dev.yml file.

        4. Using a Custom Configuration File

        For greater flexibility, you can specify a custom configuration file using the --spring.config.name and --spring.config.location options. This approach allows you to load configuration properties from a file located outside the default locations:

        ShellScript
        java -jar myapp.jar --spring.config.name=myconfig --spring.config.location=file:/path/to/config/

        Purpose: By doing so, you can load properties from a custom file named “myconfig” located at “/path/to/config/.” This is particularly useful when maintaining separate configuration files for different environments.

        Spring Boot Executable JAR

        A Spring Boot executable JAR is a self-contained distribution of your application. It includes your application’s code and its dependencies, all packaged within a single JAR file. This structure serves a vital purpose:

        Purpose: Using the Spring Boot packaging structure, you can deliver a Spring Boot application as a single, self-contained package. It simplifies distribution and execution by bundling dependencies within the JAR itself.

        Difference from Uber/Fat JAR: An uber/fat JAR unpacks all dependency classes into one flat archive, which makes it hard to tell which libraries and versions are actually included. The Spring Boot executable JAR instead keeps the dependent JARs nested, intact, inside the boot JAR, so you can easily determine which JAR dependencies and versions your application uses.

        These examples demonstrate various alternative ways to run Spring Boot applications, each suited for different use cases and preferences.

        Spring Boot Packaging

        Introduction

        When distributing a Spring Boot Java application, mastering the deployment process is essential, and packaging the application as a JAR or WAR file is a crucial step toward optimal performance and accessibility. This guide walks through the best practices for Spring Boot packaging and deployment, showing how to efficiently package, distribute, and execute both types of applications.

        1. Distributing and Executing a JAR Library Application

        To distribute and run a JAR library application, follow these steps:

        1.1 Setting the Classpath

        Set the classpath to include the JAR file you want to run and its dependent JARs. Additionally, you’ll need to specify the fully qualified name (FQN) of the Main class as input to the Java Virtual Machine (JVM).

        Bash
        # On Windows the classpath separator is ';'; on Linux/macOS use ':' instead.
        java -cp ubereats.jar;mysql-connector-java-8.0.2.jar;log4j-1.2.jar;commons-bean-utils-1.0.1.jar com.ubereats.application.Launcher

        1.2 Challenges with JAR Library Applications

        • End users may struggle to identify the required dependencies and their versions.
        • Users need to know the FQN of the Main class.
        • Manually setting the classpath and running commands can be tedious.

        Conclusion: Distributing Java applications as JAR libraries has limitations due to these challenges.

        2. Executing a WAR Application

        To distribute and run a WAR application, follow these steps:

        2.1 Setting up the Web Application Server

        • Set up a web application server or servlet container.

        2.2 Deploying the WAR File

        • Deploy the WAR file into the deployment directory of the container.

        2.3 Starting the Container

        • Start the container.

        3. Overcoming Challenges with Executable JARs

        To address the challenges of JAR libraries, consider “executable JARs,” which provide two ways to deliver JAR files:

        3.1 Distributable JAR (JAR Library)

        • Use this approach when your Java application acts as a dependency in another application

        3.2 Executable JAR

        • Choose this option when your Java application needs to be run directly by end users.

        4. Identifying Executable JARs

        An executable JAR contains information in its manifest.mf file, including the Main-Class and optional Class-Path attributes. These attributes determine if a JAR is executable.

        5. Challenges with Executable JARs

        Executable JARs have limitations:

        5.1 Inability to Identify Dependencies

        • You can’t easily identify dependencies and their versions.

        5.2 Dependency Upgrades

        • Upgrading dependencies requires rebuilding the entire application.

        6. Spring Boot Packaging JAR vs WAR

        6.1 Choosing Right Spring Boot Deployment

        Spring Boot addresses these challenges by allowing the dependent JARs to be packaged inside the executable JAR itself. This lets you deliver a single, self-contained application, which is particularly well suited to cloud deployments.

        For background on the challenges of distributing and running JAR library applications, see Section 1.

        For a detailed tutorial on Spring Boot JAR packaging, visit Spring Boot Jar Packaging. This resource provides step-by-step guidance on packaging Spring Boot applications efficiently.

        7. Spring Boot Packaging

        A Spring Boot executable JAR has the following structure:

        • Main-Class in manifest.mf points to the Spring Boot launcher, while Start-Class points to your own @SpringBootApplication class.
        • Depending on the type of application, Main-Class is set to JarLauncher (for JAR packaging) or WarLauncher (for WAR packaging).

        8. Delivering Spring Boot Executable JARs

        You can deliver your Spring Boot application as a single-packaged application to end users.

        Spring Boot Packaging Types

        1. jar
        2. war
        3. ear
        4. pom
        Spring Boot Packaging Types

        9. Spring Boot Packaging Pom – Simplifying Configuration

        Spring Boot provides tools like the spring-boot-maven-plugin and spring-boot gradle plugin to simplify packaging Spring Boot executable JARs.

        9.1 Using spring-boot-maven-plugin

        • Configure the plugin in your pom.xml to handle packaging as an executable JAR.
        XML
        <build>
          <plugins>
            <plugin>
              <groupId>org.springframework.boot</groupId>
              <artifactId>spring-boot-maven-plugin</artifactId>
              <version>2.7.12</version>
              <executions>
                <execution>
                  <phase>package</phase>
                  <goals>
                    <goal>repackage</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
          </plugins>
        </build>
        

        9.2 Using spring-boot gradle plugin

        • Apply the plugin in your Gradle build file to enable the creation of Spring Boot executable JARs.
        Groovy
        plugins {
          id 'org.springframework.boot' version '2.7.12'
        }

        These plugins automate the process of building executable JARs, making it easier for developers.

        Conclusion

        Spring Boot’s executable JAR packaging standard allows for the delivery of self-contained Java applications, simplifying distribution and deployment, especially in cloud environments. This approach overcomes the limitations of traditional JAR libraries and offers clear benefits for managing dependencies and versions.

        Spring Boot Starter

        Spring-Boot-starters

        In the world of Spring Framework application development, we often build applications with various technologies. When crafting a project tailored to our chosen technology, we face the intricate task of including the right Spring module dependencies: the modules must align with the version we require, and we also need to incorporate external libraries that harmonize with those modules.

        Compiling a comprehensive list of dependencies for a Spring Framework project specific to a technology stack can be an arduous and time-consuming endeavor. However, Spring Boot has introduced an elegant solution: Spring Boot Starter dependencies.

        Understanding Spring Boot Starter Dependencies

        Spring Boot Starter dependencies are essentially Maven projects, but with a unique twist. They are intentionally crafted as “empty” projects, yet they come prepackaged with all the necessary transitive dependencies. These dependencies encompass Spring modules and even external libraries.

        Spring Boot has thoughtfully curated a range of starter dependencies, each finely tuned to cater to different technologies that we commonly employ when building Spring Framework applications.

        At its core, a Spring Boot Starter is a set of pre-configured dependencies, packaged together to jumpstart the development of specific types of applications or components. These starters contain everything you need to get up and running quickly, reducing the complexity of configuring your application manually.

        Dependency Management in Spring Boot

        Let’s break down the process:

        1. Select the Relevant Starter Dependency:
          • Depending on the technology stack you intend to utilize for your application, you can pinpoint the appropriate Spring Boot Starter dependency.
        2. Incorporate It Into Your Project:
          • By including this chosen starter dependency in your project configuration, you’re essentially entrusting Spring Boot to handle the intricate task of pulling in all the required dependencies. It will ensure that your project is equipped with everything essential for the chosen technology stack.

        Examples:

        Let’s explore a few examples of Spring Boot Starter dependencies:

• Spring Boot 1.0.x (Spring Framework 4.0 line):
  • spring-boot-starter: an otherwise empty Maven project bundled with the matching Spring modules such as spring-core and spring-beans.
• Spring Boot 1.3.x (Spring Framework 4.2 line):
  • spring-boot-starter: another empty Maven project, tailored for a newer Spring Framework generation, again pulling in compatible versions of spring-core, spring-beans, and more.

        Putting It All Together: Simplifying Dependency Management

        Imagine you’re embarking on a project, such as a Hospital Management System (HMS). Here’s how you can leverage Spring Boot Starter dependencies:

        1. Create a Maven Project:
          • Start by initiating a new Maven project for your application, ensuring that it’s structured properly.

        Example:

        Suppose you want to create a Maven project for a web application named “MyWebApp.”

        1. Open a Terminal or Command Prompt: Navigate to the directory where you want to create your project.
        2. Use Maven’s Archetype Plugin: Execute the following command to create a new Maven project using the “maven-archetype-webapp” archetype, which is suitable for web applications:
        Bash
        mvn archetype:generate -DgroupId=com.example -DartifactId=MyWebApp -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false
        • -DgroupId: Specifies the project’s group ID, typically in reverse domain format (e.g., com.example).
        • -DartifactId: Sets the project’s artifact ID, which is the project’s name (e.g., MyWebApp).
        • -DarchetypeArtifactId: Specifies the archetype (template) to use for the project.

        3. Navigate to the Project Directory: Change your current directory to the newly created project folder:

        Bash
        cd MyWebApp

        4. Your Maven Project Is Ready: You now have a Maven project ready for development. You can start adding code and configuring your project as needed.

        Example:

        Suppose you want to add Spring Boot Starter dependencies for building a web application using Spring Boot.

        1. Open the pom.xml File: In your Maven project, locate the pom.xml file. This file is used to manage project dependencies.
        2. Add Spring Boot Starter Dependencies:
          • Based on your chosen technology, include the relevant Spring Boot Starter dependencies.
          • Ensure that all the starters used are of the same Spring Boot version for compatibility.
        3. Edit the pom.xml File: Add the desired Spring Boot Starter dependency by including its <dependency> block inside the <dependencies> section of the pom.xml file. For a web application, you can add the “spring-boot-starter-web” dependency:

        Spring boot starter dependency

        XML
        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <maven.compiler.source>1.8</maven.compiler.source>
            <maven.compiler.target>1.8</maven.compiler.target>
        </properties>
        
        <dependencies>
            <!-- Other dependencies -->
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
                 <version>2.7.15</version>
            </dependency>
        </dependencies>
        • <groupId>: Specifies the group ID for the dependency (in this case, “org.springframework.boot”).
        • <artifactId>: Sets the artifact ID for the dependency (e.g., “spring-boot-starter-web”).

4. Save the pom.xml File: After adding the dependency, save the pom.xml file. Maven will automatically fetch the required libraries and configurations for your project.

5. Build Your Project: To apply the changes, build your project using the following command:

        Bash
        mvn clean install

        Maven will download the necessary “Spring Boot Starter” dependencies and make them available for your project.

With these steps, you’ve successfully created a Maven project and added “Spring Boot Starter” dependencies, making it ready for Spring Boot application development.

        Below is a screenshot illustrating the generated project, ‘MyWebApp,’ along with the added starter dependencies.

[Screenshot: the generated ‘MyWebApp’ project with the spring-boot-starter-web dependency added]

        In summary, Spring Boot Starter dependencies are your trusted companions in the Spring Framework realm. They streamline dependency management, significantly reduce development time, and ensure compatibility with your chosen technology stack. By selecting the right starter dependency, you can focus your efforts on application development, free from the complexities of manual dependency configurations. Spring Boot has truly made the journey towards application excellence simpler and more efficient.

For further exploration and in-depth information about Spring Boot Starter Dependencies, I recommend checking out the Spring Boot Starters – Official Documentation. It provides comprehensive insights into various Starter Dependencies and their utilization in Spring Boot projects.

        Spring Cloud Config Server Without Git

        Spring Cloud Config Server

        The Spring Cloud Config Server: Decentralized Configuration Management

        The Spring Cloud Config Server empowers us to extract our microservice application’s configuration to an external repository and distribute it across the network in a decentralized manner. This decentralization facilitates convenient accessibility.

Advantages of the Spring Cloud Config Server

        Utilizing the Spring Cloud Config Server offers a significant advantage: the ability to modify service configurations externally, without necessitating changes to the application source code. This circumvents the need to rebuild, repackage, and redeploy the microservice application across various cluster nodes.

Embedding application configuration within the application itself can lead to several complications:

1. Rebuilding for Configuration Changes: Each configuration change requires rebuilding the application, yielding a new artifact version (jar).
2. Containerized Environments: In containerized environments, producing and publishing new versions of containerized images (e.g., Docker images) becomes necessary.
3. Complex Redeployment Process: Identifying running service instances, stopping them, redeploying the new service version, and restarting it becomes a complex and time-consuming endeavor, involving multiple teams.

By contrast, the Spring Cloud Config Server brings the following benefits:

1. Real-Time Configuration Updates: Configurations can be updated in real time without service interruption, enhancing agility in response to changing requirements.
2. Centralized Management: All configurations can be centrally managed and versioned, ensuring consistency and streamlined change tracking.
3. Decoupling Configurations: By externalizing configurations, services are detached from their configuration sources, simplifying the independent management of configurations.
4. Consistency Across Environments: The Config Server guarantees uniform configurations across various environments (development, testing, production), reducing discrepancies and errors.
5. Rollback and Auditing: With version control and historical tracking, reverting configurations and auditing changes becomes seamless.
6. Enhanced Security and Access Control: The Config Server incorporates security features for controlling access to and modification of configurations, reinforcing data protection.

        Spring Cloud: A Solution for Easier Configuration Management

        To address these challenges, Spring introduces the Spring Cloud module, encompassing the Config Server and Config Client tools. These tools aid in externalizing application configuration in a distributed manner, streamlining configuration management. This approach delivers the following benefits:

        • The ConfigServer/ConfigClient tools facilitate the externalization of application configuration in a distributed fashion.
        • Configuration can reside in a Git repository location, obviating the need to embed it within the application.
        • This approach expedites configuration management and simplifies the process of deploying and maintaining the application.

        By adopting the ConfigServer and ConfigClient tools, Spring Cloud simplifies the management of application configuration, enhancing efficiency and minimizing the time and effort required for deployment and maintenance.

        Building Spring Cloud Config Server Example

To build the Spring Cloud Config Server, you can use Maven or Gradle as your build tool. Below are the pom.xml dependencies needed for the config server:

        Spring Cloud Config Server Dependency
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <exclusions>
            <exclusion>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-tomcat</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-config-server</artifactId>
    </dependency>
</dependencies>
        
        Activating Spring Cloud Config Server Within Your Spring Boot App
        @EnableConfigServer
        @SpringBootApplication
        public class CloudConfigServerApplication {
            public static void main(String[] args) {
                SpringApplication.run(CloudConfigServerApplication.class, args);
            }
        }
        

        By adding the @EnableConfigServer annotation, you activate the Spring Cloud Config Server features within your application.

        Include the following configurations in the properties

        To configure your Spring Cloud Config Server, you can make use of the application.properties file. For instance:

        server.port=8888
        spring.application.name=cloud-config-server
        server.servlet.context-path=/api/v1
        
        Customizing Configuration Search Locations with application-native.properties

Furthermore, you can include configurations specific to the native profile in an application-native.properties file. If the Config Server should serve configurations from the classpath’s /configs folder (the native, Git-less mode), you can specify the search locations as follows:

        1. Create an application-native.properties file in the resources folder.
        2. Include the following configuration in the file to define the search locations:
        spring.cloud.config.server.native.searchLocations=classpath:/configs,classpath:/configs/{application}
        

        With these configurations in place, your Spring Cloud Config Server will be primed to handle configuration management effectively.

Create configuration files named exactly after the config client application, with one properties file per environment (profile). For instance:

[Screenshot: environment-specific properties files (e.g., cloud-config-client-dev.properties) under the configs folder]

        Include the following properties within the cloud-config-client-dev.properties file. You can adjust the properties according to the specific profiles:

        spring.application.name=cloud-config-client
        server.port=8080
        student.name=Sachin
        student.rollNo=1234
        student.email=sachin@gmail.com
        student.phone=123456789
        

To start the Config Server application, provide the following VM argument:

        -Dspring.profiles.active=local,native
        

        For further reference, you can access the source code on GitHub at: https://github.com/askPavan/cloud-config-server

        Exploring What is Spring Boot and Its Features


The Spring Framework provides a lot of functionality out of the box, allowing us to avoid boilerplate logic and develop applications quickly. So what is Spring Boot, and why do we need it? For the Spring Framework to provide that boilerplate logic, we first need to describe information about our application and its components to the framework.

        It’s not just our application components that need configuration; even the classes provided by the Spring Framework have to be configured as beans within the Spring Framework.

        Exploring Spring Framework Configuration Options

        XML-based Configuration: A Classic Approach

• One way we can provide configuration information about our classes to the Spring Framework is XML-based configuration, specifically the Spring Bean Configuration File. Of course, Spring handles much of this process for us, but nothing comes without a cost: we need to provide a considerable amount of information about our application and its components for Spring to comprehend and offer the desired functionality. At a certain point, we may find ourselves investing a significant amount of time in describing our application’s details, which increases the complexity of using the Spring Framework and makes it harder to work with.
• Spring Framework developers noticed this as well. Writing configuration in XML can be cumbersome, error-prone, and time-consuming, so Spring introduced annotations as an alternative to XML-based configuration.

        Stereotype Annotations: A Leap Forward

        • That’s where Spring has introduced stereotype annotations to expedite the configuration of our application classes within the Spring Framework. Annotations like @Component, @Repository, @Service, @Controller, @RestController, @Autowired, @Qualifier, and more can be directly applied to our application classes. This approach helps us bypass the need for XML-based configuration.
        • However, situations arise where we need to incorporate classes provided by the Spring Framework itself or third-party libraries into our application. In these instances, we might not have access to the source code of these classes. Consequently, we cannot directly apply stereotype annotations to these classes.
• The typical approach now is to use stereotype annotations for our own classes (where we have the source code) and Spring Bean configuration for framework or third-party classes (where we do not). This combination entails utilizing both Spring Bean Configuration and stereotype annotations; a short example of the stereotype style follows this list. However, it appears that we haven’t entirely resolved the initial issue. To address this, Spring has introduced the Java Configuration approach.
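To make the stereotype approach concrete, here is a minimal sketch, assuming spring-context (or Spring Boot) is on the classpath and component scanning is enabled; the class names are illustrative:

Java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;

// One of our own classes: the stereotype annotation replaces an XML <bean> entry.
@Repository
class CustomerRepository {
    String findNameById(int id) {
        return "customer-" + id; // stand-in for a real lookup
    }
}

// Another of our classes, wired together by component scanning and @Autowired.
@Service
class CustomerService {

    @Autowired
    private CustomerRepository repository;

    public String greet(int id) {
        return "Hello, " + repository.findNameById(id);
    }
}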

        Java Configuration: Bridging the Gap

• Spring has introduced the Java Configuration approach: instead of configuring classes without source code in a Spring Bean Configuration file, we can write their configuration in a separate Java Configuration class (see the example after this list). Advantages:
          • No need to memorize XML tags for configuration.
          • Type-safe configuration.
        • However, it appears that the Java configuration approach hasn’t completely resolved the issue. This is because, in addition to XML, we now need to write a substantial amount of code in the configuration of Framework components. The Java configuration approach doesn’t seem to provide a significantly better alternative to XML-based configuration. Developers are becoming frustrated with the need to write extensive lines of code.
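To illustrate the Java Configuration approach, here is a minimal sketch; PdfRenderer stands in for a hypothetical third-party class whose source we cannot annotate:

Java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical third-party class: we cannot add stereotype annotations to it.
class PdfRenderer {
    private final String license;
    PdfRenderer(String license) { this.license = license; }
    public String render(String text) { return "[pdf:" + license + "] " + text; }
}

// The Java Configuration class replaces the XML <bean> definition for such classes.
@Configuration
public class AppConfig {

    @Bean
    public PdfRenderer pdfRenderer() {
        // Type-safe, compiler-checked configuration instead of memorizing XML tags.
        return new PdfRenderer("demo-license");
    }
}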

        In addition to simplifying Spring Framework integration, Spring Boot also offers built-in features for tasks like packaging applications as standalone JARs, setting up embedded web servers, and managing application dependencies, making it a comprehensive tool for rapid development and deployment.

What does Spring Boot provide?

Spring Boot is a module that addresses the non-functional requirements of building a Spring Framework-based application.

        Advantages

Spring modules are generally used to build the functional aspects of an application: Spring JDBC, for example, is used for building the persistence tier, and Spring MVC for building web applications. Unlike these modules, Spring Boot is not used for building any functional aspect of an application; rather, it helps developers speed up the development of a Spring-based application.

How does Spring Boot help us build Spring Framework applications faster?

Spring Boot features:

        1. Auto Configurations
        2. Starter Dependencies
        3. Actuator Endpoints
        4. DevTools [Development Feature]:
        5. Embedded Container
        6. Spring Boot CLI

        1. Auto Configurations:

        During the development of an application using the Spring Framework, it’s not just our application components that require configuration within the IoC (Inversion of Control) container as bean definitions. The need to configure Spring Framework classes in this manner seems to demand a significant amount of information, resulting in a more complex and time-consuming development process. This is where the concept of auto-configuration steps in.

        • Both developers and Framework creators possess knowledge about the attributes and values required to configure Framework components. Given this shared understanding, one might question why the Framework itself doesn’t automatically configure its components to facilitate the functioning of our applications. This is the essence of Auto Configurations.
        • Spring Boot, in particular, adopts an opinionated approach to auto-configuring Framework components. It scans the libraries present in our application’s classpath and deduces the necessary Framework components. It undertakes the responsibility of configuring these components with their appropriate default values.
• For instance, if Spring Boot detects the presence of the "spring-jdbc" library in the classpath and identifies a database driver in use (let's say "h2" in this case), it proceeds to configure essential bean definitions such as DriverManagerDataSource, DataSourceTransactionManager, and JdbcTemplate, all set to default values for the "h2" database (a brief sketch follows this list).
        • Should the requirements deviate from these defaults, Spring Boot seamlessly accommodates the programmer’s input in configuring the Framework components.
        • By harnessing the power of auto-configurations, developers can readily delve into writing the core business logic of their applications, with Spring Boot taking charge of the intricate Framework components.
        • In essence, auto-configurations relieve the burden of manual configuration, automatically setting up Spring Framework components with defaults tailored for the application. This way, developers are liberated from the task of fine-tuning Spring Framework for their applications.
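As a minimal sketch of the JDBC example above, assuming spring-boot-starter-jdbc and the H2 driver are on the classpath and a STUDENT table has been created (for example via schema.sql): the JdbcTemplate is simply injected, because auto-configuration has already registered the DataSource and JdbcTemplate beans with sensible defaults.

Java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class StudentCountRepository {

    // No DataSource or JdbcTemplate bean was declared by us; Spring Boot auto-configured them.
    @Autowired
    private JdbcTemplate jdbcTemplate;

    public Integer countStudents() {
        // Assumes a STUDENT table exists (e.g., created through schema.sql).
        return jdbcTemplate.queryForObject("SELECT COUNT(*) FROM STUDENT", Integer.class);
    }
}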

        2. Starter Dependencies:

• Spring Boot provides curated starter modules (essentially POM-only Maven artifacts) designed to expedite the configuration of project dependencies. These "Boot starter dependencies" streamline the incorporation of both Spring Framework modules and external library dependencies by aligning them with the appropriate versions, based on the selected Spring Boot version.
        • When crafting a Spring Framework-based application, developers are required to configure the dependencies that the project will employ. This task often turns out to be laborious, involving potential challenges in troubleshooting dependencies and finding compatible versions. Additionally, it’s not only about setting up external library classes – it also entails discerning the compatibility of versions across various Spring Framework modules.
        • Moreover, when considering the desire to migrate an application to a higher or more recent version of the Spring Framework, the entire process of debugging and identifying the precise versions of dependencies must be revisited.
        • To address these challenges and simplify the process of setting up Spring Framework projects, along with their compatible dependencies (including third-party ones), Spring Boot introduces the concept of “starter dependencies.”
        • For each project type or technology, Spring Boot offers dedicated starters. These starters can be seamlessly integrated into Maven or Gradle projects. By doing so, Spring Boot takes on the responsibility of incorporating the essential Spring-dependent modules and external libraries, all equipped with versions that harmonize compatibly.

        3. Actuator Endpoints:

        Using Spring Boot, we have the capability to develop applications that smoothly transition from development to production-grade deployment. Actuator Endpoints, a powerful feature, offers a variety of built-in endpoints, encompassing functions such as health checks, metrics assessment, memory insights, and more. Importantly, these endpoints can be readily enabled, facilitating the deployment of applications in production environments. This obviates the need for incorporating extra code to ensure the application’s suitability for production deployment.

        • Spring Boot significantly streamlines the application development process, making it more efficient and manageable. One of its standout features is the inclusion of Actuator Endpoints. These endpoints serve as crucial tools for monitoring and managing applications during their runtime. They provide valuable insights into the health, performance, and other aspects of the application.
        • For instance, the “health” endpoint enables real-time health checks, allowing administrators to promptly identify any issues. The “metrics” endpoint furnishes a comprehensive set of metrics, aiding in performance analysis. Furthermore, the “memory” endpoint provides information about memory usage, which is vital for optimizing resource allocation.
• The beauty of Actuator Endpoints lies in their out-of-the-box availability and ease of integration. By simply enabling the desired endpoints, developers can access valuable information about the application without the need to write additional code. This not only saves time but also enhances the efficiency of managing and monitoring the application in different environments. (A small example of extending the health endpoint follows this list.)
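As a small sketch of how the health endpoint can be extended, assuming spring-boot-starter-actuator is on the classpath, a custom HealthIndicator contributes its own entry to /actuator/health; the threshold used here is purely illustrative:

Java
import java.io.File;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

// Picked up automatically and merged into the /actuator/health response.
@Component
public class WorkingDirSpaceHealthIndicator implements HealthIndicator {

    @Override
    public Health health() {
        long freeBytes = new File(".").getFreeSpace();
        // Illustrative threshold: report DOWN below roughly 100 MB of free space.
        if (freeBytes < 100L * 1024 * 1024) {
            return Health.down().withDetail("freeBytes", freeBytes).build();
        }
        return Health.up().withDetail("freeBytes", freeBytes).build();
    }
}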

        4. DevTools [Development Feature]:

• Debugging code becomes remarkably efficient with the aid of DevTools. Typically, when we make code modifications during development, we’re compelled to redeploy and restart the application server, which consumes a considerable amount of development time. DevTools brings a refreshing change: it watches the classpath and automatically restarts only our application’s classes (third-party libraries stay loaded in a separate base classloader), so code changes are reflected without a full manual redeploy and restart. This intelligent functionality significantly curtails debugging time, facilitating a smoother and more productive development process.

        5. Embedded Container:

• The concept of an embedded container is a remarkable feature that enhances the development process. In this approach, the server is integrated into the project as a library. Consequently, you can execute your project directly from the codebase. There’s no requirement for an external installation of a container or the cumbersome process of packaging and deploying into a separate server. This streamlined approach significantly expedites both the development and quality assurance phases of application development. (A minimal example follows.)
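As a minimal sketch of running on the embedded container, assuming spring-boot-starter-web is on the classpath: starting this class from the IDE (or via java -jar) boots an embedded Tomcat and serves the endpoint, with no external server installation or WAR deployment.

Java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@SpringBootApplication
public class EmbeddedContainerDemo {

    @GetMapping("/ping")
    public String ping() {
        return "pong"; // served by the embedded servlet container started below
    }

    public static void main(String[] args) {
        // Starts the application together with its embedded servlet container.
        SpringApplication.run(EmbeddedContainerDemo.class, args);
    }
}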

        6. Spring Boot CLI:

        The Spring Boot Command Line Interface (CLI) introduces a powerful tool to swiftly develop and execute prototype code. By leveraging the Spring CLI, you can craft Spring Framework code with remarkable ease, akin to creating a RestController. This code can then be promptly executed using the CLI.

        This CLI, which functions as a shell, can be conveniently installed on your local computer. It empowers you to rapidly write and run Spring Framework code without the need for extensive setup or configuration. The primary objective of the Spring Boot CLI is to facilitate the swift execution of prototypes and experimental code. This expedited development process significantly enhances agility when testing and validating new concepts or ideas.

Summary of Features

Here’s a concise summary of the key features offered by Spring Boot:

        1. Jump-Start Experience: Spring Boot provides a seamless starting point for building Spring Framework applications, accelerating the setup process.
        2. Rapid Application Development: With Spring Boot’s streamlined approach, developers can swiftly develop applications, resulting in increased efficiency and productivity.
        3. Auto Configurations: The auto-configuration feature efficiently configures Framework components with default settings. In cases where requirements differ, simple configurations allow for easy tuning of components.
        4. Production-Grade Deployment: Spring Boot empowers the deployment of applications that meet production-grade standards, ensuring stability and reliability.
        5. Enhanced Non-Functional Aspects: Beyond core functionality, Spring Boot addresses non-functional aspects of application development. This includes features like debugging, automatic restart during development, and robust tools for metrics and memory management.

        In essence, Spring Boot revolutionizes Spring Framework application development by offering an array of capabilities that streamline the process, bolster production readiness, and enhance the development experience.

        Further Reading:

        Spring Boot Official Documentation: Explore the official documentation for comprehensive information about Spring Boot’s features, configurations, and best practices.

        Spring Cloud Config Client

        Spring Cloud Config Client Example

        During the boot-up of the service, the Spring Cloud Config Client connects to the config server and fetches the service-specific configuration over the network. This configuration is then injected into the Environment object of the IOC (Inversion of Control) container, which is used to start the application.

        Create the Spring Boot Project

Let’s kick off by creating a Spring Boot Maven project named “spring-cloud-config-client.” To achieve this, there are two paths you can take: either visit the Spring Initializr website or leverage your trusted Integrated Development Environment (IDE). The resulting project structure is as follows.

[Screenshot: spring-cloud-config-client project structure]


        To understand the implementation of Spring Cloud Config Client, let’s walk through a hands-on example.

        Begin by creating a Spring Boot project and adding the following dependencies to your pom.xml:

        <dependency>
          <groupId>org.springframework.cloud</groupId>
          <artifactId>spring-cloud-starter-config</artifactId>
        </dependency>
        <dependency>
          <groupId>org.springframework.cloud</groupId>
          <artifactId>spring-cloud-starter-bootstrap</artifactId>
        </dependency>

        Implementing Spring Cloud Config Client in Microservices: A Step-by-Step Guide

        In your main application class, the heart of your Spring Boot application, bring in essential packages and annotations. Introduce the @RefreshScope annotation, a key enabler for configuration refreshing. Here’s a snippet to illustrate:

        Java
        @SpringBootApplication
        @RefreshScope
        public class CloudConfigClientApplication implements ApplicationRunner{
        
        	@Autowired
        	private StudentsController studentsController;
        	
        	@Value("${author}")
        	private String author;
        	
        	public static void main(String[] args) {
        		SpringApplication.run(CloudConfigClientApplication.class, args);
        	}
        
        	@Override
        	public void run(ApplicationArguments args) throws Exception {
        		System.out.println(studentsController.getStudentDetails().getBody());
        		System.out.println("Author ** "+author);
        	}
        
        }

        Include the following configurations in your application.properties or application.yml file to set up Spring Cloud Config

        management:
          endpoint:
            refresh:
              enabled: true
          endpoints:
            web:
              exposure:
                include:
                - refresh
              
        spring:
          application:
            name: cloud-config-client
          config:
            import: configserver:http://localhost:8888/api/v1
          profiles:
            active: dev
          main:
            allow-circular-references: true
        


        Here’s an example of a simple Student bean class

        public class Student {
        
        	private String studentName;
        	private String studentRollNo;
        	private String studentEmail;
        	private String phone;
        	//Generate getters and setters
        }

        Create a student REST controller

        @RestController
        @RequestMapping("/api/v1")
        public class StudentsController {
        
        	@Autowired
        	private Environment env;
        		
        	@GetMapping("/students")
        	public ResponseEntity<Student> getStudentDetails(){
        		Student student = new Student();
        		student.setStudentName(env.getProperty("student.name"));
        		student.setStudentRollNo(env.getProperty("student.rollNo"));
        		student.setStudentEmail(env.getProperty("student.email"));
        		student.setPhone(env.getProperty("student.phone"));
        		return new ResponseEntity<Student>(student, HttpStatus.OK);
        	}
        }
        1. Start the Spring Cloud Config Server: Before setting up the Spring Cloud Config Client, ensure the Spring Cloud Config Server is up and running.
        2. Start the Spring Cloud Config Client: Next, initiate the Spring Cloud Config Client by starting your application with the desired profile and Spring Cloud Config settings using the command below:
        Bash
        java  -Dspring.profiles.active=dev  -jar target/your-application.jar

Replace:

• dev with the desired profile (dev, sit, uat, etc.).
• http://localhost:8888/api/v1 in the spring.config.import entry of application.yml with the actual URL of your Spring Cloud Config Server.
• your-application.jar with the name of your application’s JAR file.

After starting the application with the specified Spring Cloud Config settings, you can access the following local URL: http://localhost:8080/api/v1/students (port 8080 comes from the cloud-config-client-dev.properties served by the Config Server). The output looks like below when you hit this endpoint:

        {
            "studentName": "Sachin",
            "studentRollNo": "1234",
            "studentEmail": "sachin1@gmail.com",
            "phone": "123456781"
        }

For more information on setting up and using Spring Cloud Config Server, you can refer to the Spring Cloud Config Server blog post at https://javadzone.com/spring-cloud-config-server/.

        In a nutshell, Spring Cloud Config Client enables seamless integration of dynamic configurations into your Spring Boot application, contributing to a more adaptive and easily maintainable system. Dive into the provided example and experience firsthand the benefits of efficient configuration management. If you’d like to explore the source code, it’s available on my GitHub Repository: GitHub Repository Link. Happy configuring!

        Spring Boot Eureka Discovery Client

        Spring Boot Eureka Discovery Client

        In today’s software landscape, microservices are the building blocks of robust and scalable applications. The Spring Boot Eureka Discovery Client stands as a key enabler, simplifying the intricate web of microservices. Discover how it streamlines service discovery and collaboration.

        Spring Boot Eureka Client Unveiled

        Diving Deeper into Spring Boot Eureka Client’s Vital Role

        The Spring Boot Eureka Client plays an indispensable role within the Eureka service framework. It serves as the linchpin in the process of discovering services, especially in the context of modern software setups. This tool makes the task of finding and working with services in microservices exceptionally smooth.

        Your Guide to Effortless Microservices Communication

        Navigating Microservices with the Spring Boot Eureka Client

        Picture the Eureka Discovery Client as an invaluable guide in the world of Eureka. It simplifies the intricate process of connecting microservices, ensuring seamless communication between different parts of your system.

        Spring Boot Eureka Discovery Client as Your Service Discovery Library

        Delving Deeper into the Technical Aspects

        From a technical standpoint, think of the Eureka Discovery Client as a library. When you integrate it into your microservices, it harmonizes their operation with a central Eureka Server, acting as a hub that keeps real-time tabs on all available services across the network.

        Empowering Microservices with Spring Boot Eureka Client

        Discovering and Collaborating with Ease

        Thanks to the Eureka Discovery Client, microservices can effortlessly join the network and discover other services whenever they need to. This capability proves invaluable, particularly when dealing with a multitude of services that require quick and efficient collaboration.

        Simplifying Setup, Strengthening Microservices

        Streamlining Setup Procedures with the Spring Boot Eureka Client

        One of the standout advantages of the Eureka Discovery Client is its ability to simplify the often complex setup procedures. It ensures that services can connect seamlessly, freeing you to focus on enhancing the resilience and functionality of your microservices.

        Getting Started with Spring Boot Eureka Client

        Your Journey Begins Here

        If you’re contemplating the use of the Spring Boot Eureka Client, here’s a step-by-step guide to set you on the right path:

        Setting Up Eureka Server

        Establishing Your Eureka Server as the Central Registry

        Before integrating the Eureka Discovery Client, you must have a fully operational Eureka Server. This server serves as the central registry where microservices register themselves and discover other services. For detailed instructions, refer to the Eureka Server Setup Guide.

        Adding Dependencies for Spring Boot Eureka Discovery Client

        Integrating Essential Dependencies into Your Microservice Project

        In your microservice project, including the required dependencies is essential. If you’re leveraging Spring Boot, add the spring-cloud-starter-netflix-eureka-client dependency to your project’s build file. For instance, in a Maven project’s pom.xml or a Gradle project’s build.gradle:

        Eureka Discovery Client Maven Dependency

        XML
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
        </dependency>
        

        Eureka Client Gradle Dependency

        Integrating the Eureka Client Dependency in Your Gradle Project

        To include the Spring Boot Eureka Client in your Gradle project, add the following dependency to your build.gradle file:

Groovy
        dependencies {
            implementation 'org.springframework.cloud:spring-cloud-starter-netflix-eureka-client'
        }

        Configuring Application Properties

        Optimizing the Spring Boot Eureka Client Configuration

Tailoring your microservice’s properties and Eureka client settings in the application.yml (or application.properties) file is crucial for optimal usage of the Spring Boot Eureka Client. Below is a sample configuration:

YAML
        spring:
          application:
            name: eureka-discovery-client-app
        server:
          port: 8089
        eureka:
          client:
            register-with-eureka: true
            fetch-registry: false
            service-url:
              defaultZone: http://localhost:8761/eureka/,http://localhost:8762/eureka/
          instance:
             preferIpAddress: true
        

        Enabling Spring Boot Eureka Discovery Client

        Activating the Power of Spring Boot Eureka Client

        To enable the Spring Boot Eureka Client functionality in your Java code, annotate your main application class as shown below:

        Java
        @EnableDiscoveryClient
        @SpringBootApplication
        public class EurekaClientApplication {
        
        	public static void main(String[] args) {
        		SpringApplication.run(EurekaClientApplication.class, args);
        	}
        }

Service Registration and Discovery

        Automated Service Registration and Effortless Discovery

        Once your microservice initializes, it will autonomously register itself with the Eureka Server, becoming a part of the network. You can confirm this registration by examining the Eureka Server’s dashboard. Simply visit your Eureka Server’s URL, e.g., http://localhost:8761/

[Screenshot: Eureka Server dashboard showing the registered eureka-discovery-client-app instance]
        Seamlessly Discovering Services

        Locating Services in Your Microservices Architecture

        To locate other services seamlessly within your microservices architecture, leverage the methods provided by the Eureka Discovery Client. These methods simplify the retrieval of information regarding registered services. Programmatically, you can acquire service instances and their corresponding endpoints directly from the Eureka Server.
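A minimal sketch of this programmatic lookup, using Spring Cloud’s DiscoveryClient abstraction; the service id "payment-service" is illustrative and must match a name registered in Eureka:

Java
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.client.ServiceInstance;
import org.springframework.cloud.client.discovery.DiscoveryClient;
import org.springframework.stereotype.Service;

@Service
public class ServiceLookup {

    // Spring Cloud's abstraction over the Eureka registry.
    @Autowired
    private DiscoveryClient discoveryClient;

    public void printInstances() {
        // "payment-service" is an illustrative service id registered with Eureka.
        List<ServiceInstance> instances = discoveryClient.getInstances("payment-service");
        for (ServiceInstance instance : instances) {
            System.out.println(instance.getServiceId() + " -> " + instance.getUri());
        }
    }
}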

        For further reference and to explore practical examples, check out the source code illustrating this process on our GitHub repository.

        Reload Application Properties in Spring Boot: 5 Powerful Steps to Optimize

        Refresh Configs without restart

        In the world of Microservices architecture, efficiently managing configurations across multiple services is crucial. “Reload Application Properties in Spring Boot” becomes even more significant when it comes to updating configurations and ensuring synchronization, as well as refreshing config changes. However, with the right tools and practices, like Spring Cloud Config and Spring Boot Actuator, this process can be streamlined. In this guide, we’ll delve into how to effectively propagate updated configurations to all Config Clients (Microservices) while maintaining synchronization.

        Spring Cloud Config Server Auto-Reload

When you change a configuration file in your config repository and commit the change, a Spring Cloud Config Server that is configured to automatically reload updated configurations keeps your microservices up to date.

To set up auto-reloading, configure the refresh rate in the Config Server’s configuration file, typically application.yml or application.properties. The key setting is the refresh-rate property, which specifies how often the Config Server checks the backend for updates and reloads configurations:

        spring:
          cloud:
            config:
              server:
                git:
          refresh-rate: 3000 # How often (in seconds) the Config Server fetches updated configuration from Git
        

        Refresh Config Clients with Spring Boot Actuator

        To get started, add the Spring Boot Actuator dependency to your microservice’s project. You can do this by adding the following lines to your pom.xml:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>
        

        While the Config Server reloads configurations automatically, the updated settings are not automatically pushed to the Config Clients (microservices). To make sure these changes are reflected in the Config Clients, you must trigger a refresh.

This is where Spring Boot Actuator comes in. It provides various management endpoints, including the refresh endpoint, which is essential for pushing updated configurations into Config Clients.

        Reload Application Properties in Spring Boot: Exposing the Refresh Endpoint

        Next, you need to configure Actuator to expose the refresh endpoint. This can be done in your microservice’s application.yml or .properties file:

        management:
          endpoint:
            refresh:
              enabled: true
          endpoints:
            web:
              exposure:
                include: refresh
        

        Java Code Example

        Below is a Java code example that demonstrates how to trigger configuration refresh in a Config Client microservice using Spring Boot:

        import org.springframework.beans.factory.annotation.Value;
        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.cloud.context.config.annotation.RefreshScope;
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.RestController;
        import org.springframework.cloud.context.refresh.ContextRefresher;
        
        @SpringBootApplication
        public class ConfigClientApplication {
        
            public static void main(String[] args) {
                SpringApplication.run(ConfigClientApplication.class, args);
            }
        }
        
        @RestController
        @RefreshScope
        public class MyController {
        
            @Value("${example.property}")
            private String exampleProperty;
        
            private final ContextRefresher contextRefresher;
        
            public MyController(ContextRefresher contextRefresher) {
                this.contextRefresher = contextRefresher;
            }
        
            @GetMapping("/example")
            public String getExampleProperty() {
                return exampleProperty;
            }
        
            @GetMapping("/refresh")
            public String refresh() {
                contextRefresher.refresh();
                return "Configuration Refreshed!";
            }
        }
        
        1. Reload Application Properties:
        • To trigger a refresh in a Config Client microservice, initiate a POST request to the refresh endpoint. For example: http://localhost:8080/actuator/refresh.
        • This request will generate a refresh event within the microservice.

Send a POST Request with Postman: open Postman and create a POST request to your microservice’s refresh endpoint. The URL should look something like this: http://localhost:8080/actuator/refresh

[Screenshot: POST request to the /actuator/refresh endpoint in Postman]

        2. Bean Reloading:

        • Configurations injected via @Value annotations in bean definitions adorned with @RefreshScope will be reloaded when a refresh is triggered.
• If values are injected through @ConfigurationProperties, the IoC container automatically rebinds the configuration when a refresh is triggered (see the sketch after this list).
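A minimal sketch of such a @ConfigurationProperties bean, assuming properties prefixed with student.* as in the earlier Config Server examples; the fields shown are illustrative:

Java
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Rebound automatically when a refresh is triggered; no @RefreshScope required.
@Component
@ConfigurationProperties(prefix = "student")
public class StudentProperties {

    private String name;
    private String email;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}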

        By following these steps and incorporating the provided Java code example, you can effectively ensure that updated configurations are propagated to your Config Clients, and their synchronization is managed seamlessly using Spring Cloud Config and Spring Boot Actuator. This approach streamlines configuration management in your microservices architecture, allowing you to keep your services up to date efficiently and hassle-free.

In this guide, we’ve explored the intricacies of Spring Cloud Config and Spring Boot Actuator in efficiently managing and refreshing configuration changes in your microservices architecture. To delve deeper into these tools and practices, you can learn more about Spring Cloud Config and its capabilities. By leveraging these technologies, you can enhance your configuration management and synchronization, ensuring seamless operations across your microservices.


        Spring Boot Eureka Server Tutorial

In the ever-evolving realm of microservices architecture, services are spread across various nodes within a cluster and registered with a service registry such as Netflix Eureka Server. Unlike monolithic applications, where modules are tightly integrated, microservices often run on specific cluster nodes, presenting a challenge for client applications striving to connect with them.

        Introduction: Simplifying Microservices with Eureka Server

        Microservices are a powerful architectural approach for building scalable and maintainable systems. However, in a distributed microservices environment, locating and connecting with individual services can be complex. This is where the Netflix Eureka Server comes to the rescue. Eureka Server simplifies service discovery, enabling microservices to effortlessly locate and communicate with each other within a cluster.

        Understanding Eureka Server

        Eureka Server, often referred to as Netflix Eureka Server, acts as a centralized service registry within a microservices cluster. During initialization, each microservice registers its information with the Eureka Server. This typically includes the service’s name, network location, and other pertinent details.

        Real-World Example: Eureka Server in Action

        To better understand the practical utility of Eureka Server, let’s delve into a real-world example. Imagine you’re responsible for building a large-scale e-commerce platform composed of various microservices. These microservices include the product catalog, user authentication, payment processing, order management, and more.

        In a microservices-based architecture, these services may be distributed across different servers or containers within a cloud-based environment. Each service needs to communicate with others efficiently to provide a seamless shopping experience for customers.

        This is where Eureka Server comes into play. By integrating Eureka Server into your architecture, you create a centralized service registry that keeps track of all available microservices. Let’s break down how it works:

        1. Service Registration: Each microservice, such as the product catalog or payment processing, registers itself with the Eureka Server upon startup. It provides essential information like its name and network location.
        2. Heartbeats: Microservices send regular heartbeats to Eureka Server to indicate that they are operational. If a service stops sending heartbeats (e.g., due to a failure), Eureka Server can mark it as unavailable.
        3. Service Discovery: When one microservice needs to communicate with another, it queries the Eureka Server to discover the service’s location. This eliminates the need for hardcoding IP addresses or endpoints, making the system more dynamic and adaptable.
        4. Load Balancing: Eureka Server can also help with load balancing. If multiple instances of a service are registered, Eureka can distribute requests evenly, improving system reliability and performance.

        In our e-commerce example, the product catalog service can easily locate and interact with the payment processing service using Eureka Server. As traffic fluctuates, Eureka Server ensures that requests are distributed optimally, preventing overloading on any single instance.
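As a hedged sketch of how the catalog service might call the payment service by its registered name instead of a hardcoded host, assuming both services are Eureka clients and a client-side load balancer (Ribbon or Spring Cloud LoadBalancer) is on the classpath; the service name and path are illustrative:

Java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Configuration
class CatalogClientConfig {

    // @LoadBalanced resolves service names through the registry and spreads calls across instances.
    @Bean
    @LoadBalanced
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }
}

@Service
class CatalogService {

    @Autowired
    private RestTemplate restTemplate;

    public String checkPaymentStatus(String orderId) {
        // "payment-service" is the illustrative name the payment microservice registered with Eureka.
        return restTemplate.getForObject(
                "http://payment-service/payments/" + orderId + "/status", String.class);
    }
}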

        By employing Eureka Server, you streamline the development, deployment, and scaling of your microservices-based e-commerce platform. It simplifies service discovery and enhances the overall reliability of your system.

        This real-world example demonstrates how Eureka Server can be a game-changer in managing and scaling microservices, making it a valuable tool in modern software development.

        Eureka Server Spring Boot Integration

One of the strengths of Eureka Server is its seamless integration with the Spring Boot framework through Spring Cloud. By incorporating the spring-cloud-starter-netflix-eureka-server dependency into your project, configuring the server becomes straightforward. This simplification expedites the setup process, allowing microservices, especially those built with Spring Boot, to quickly join the Eureka ecosystem.

Initiating the Eureka Server Project

Let’s kick off by creating a Spring Boot Maven project named “eureka-server.” To achieve this, there are two paths you can take: either visit the Spring Initializr website or leverage your trusted Integrated Development Environment (IDE). The resulting project structure is as follows.

[Screenshot: eureka-server project structure]

        Implementing Eureka Server

        Maven Dependency for Eureka Server

For projects managed with Maven, add the following dependency to your pom.xml file:

        XML
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-netflix-eureka-server</artifactId>
        </dependency>

        Gradle Dependency for Eureka Server

If you prefer Gradle for your project, add this dependency to your build.gradle file:

Groovy
        dependencies {
            implementation 'org.springframework.cloud:spring-cloud-starter-netflix-eureka-server'
        }

        Eureka Server Configuration

        To configure Eureka Server, create an application.yml or application.properties file. Below is an example configuration in YAML format:

YAML
        spring:
          application:
            name: eureka-server
        
        server:
          port: 8761
        eureka:
          client:
            register-with-eureka: false
            fetch-registry: false
            healthcheck:
              enabled: true

Enabling the Eureka Server

        Java
        @EnableEurekaServer
        @SpringBootApplication
        public class EurekaServerApplication {
            public static void main(String[] args) {
                SpringApplication.run(EurekaServerApplication.class, args);
            }
        }

        Running the Eureka Server Application

To begin using Eureka Server, follow these steps to run the application on your local machine:

        1. Clone the Repository:
        • Launch your terminal and navigate to the desired directory where you intend to clone the Eureka Server repository.
• Execute the following command to clone the repository:
        Bash
        git clone https://github.com/askPavan/eureka-server

        2. Build the Application:

        • Go to the directory where you have cloned the Eureka Server repository.
• Utilize the following command to build the Eureka Server application:
Bash
mvn clean install

        3. Run the application.

        4. Access the Eureka Server Dashboard:

        • Once the server is up and running, open your web browser.
        • Enter the following URL to access the Eureka Server dashboard:
http://localhost:8761/

        For the Eureka Client application, you can use the following URL: Eureka Client App URL

5. View the Eureka Server Output:

        • You will now see the Eureka Server dashboard, which displays information about the registered services and their status.
        • Explore the dashboard to see the services that have registered with Eureka Server.

        Example Output

        Here is an example of what the Eureka Server dashboard might look like once the server is running:

[Screenshot: Eureka Server dashboard listing the registered service instances]

        By running both the Eureka Server and Eureka Client applications, you can observe how services are registered and discovered in the Eureka ecosystem. This hands-on experience will help you better understand the functionality of Eureka Server and its interaction with client applications. For the source code of the Eureka Client, you can refer to this GitHub repository.

        Exploring Practical Examples

        For hands-on experience and practical illustrations, you can explore our GitHub repository. This repository contains real-world implementations of Eureka Server using Spring Boot.

        Conclusion: Simplifying Microservices with Eureka Server

        In conclusion, Eureka Server is a potent tool for simplifying microservices in a distributed architecture. Its seamless integration with Spring Boot streamlines the setup process, enabling you to efficiently implement Eureka Server in your microservices ecosystem.

        Eureka Server facilitates effortless service discovery, allowing microservices to seamlessly identify and communicate with one another. This capability is indispensable for constructing robust and efficient distributed systems.

        What are Microservices?


Microservices, an innovative approach to developing software applications (often built with Spring Boot), involve decomposing the application into:

        • Smaller
        • Independent
        • Deployable
        • Loosely coupled
        • Collaborative Services

These smaller services bring down the complexity of understanding the application and ease its delivery.

Before moving to microservices, we need to understand monolithic architecture.

        Microservices vs Monolith

        What is Monolithic Application Architecture?

A monolithic application has several modules as part of it; all of these modules are built as one single system and delivered as a single deployable artifact. That, in short, is the monolithic application development architecture.

[Diagram: monolithic vs microservices architecture]

What are the reasons for choosing monolithic architecture?

1. It is easy to scale the application.

2. You want to adopt continuous integration and delivery for your application.

3. Developers will be able to quickly understand the application and be productive in development and delivery.

        Advantages of Monolithic Architecture

        1. Achieving scalability is very easy :

        if the load on the system is high, then we can copy the single deployable artifact of our application across multiple servers across the cluster.

        2. Easy to understand

        The entire software system has been built out of one single code base, the entire team of developers knows everything about the system they are working on. understanding such a software system out of a single code base is very easily and developers can be productive in building and delivery the application

        3. As the entire system is built as a single deployable artifact, we can easily achieve continuous integration and delivery (CI/CD) without any module dependency complexities.

        4. Monolithic architecture-based application development is better suited for applications that are small to moderate in size; if the application grows bigger, managing the development and delivery of the system through a monolithic architecture brings a lot of problems.

        Disadvantages of Monolithic Architecture

        1. The entire system is built out of a single code base:

        Many developers are afraid of such a big system and find it very complicated to understand and develop.

        Many developers do not know how to achieve modularity in their code, due to which the code base quickly degrades.

        The impact of a change is going to be very high and difficult to handle.

        2. Overloaded IDE:

        Due to the huge code base, IDEs struggle to manage the code.

        To develop code comfortably, a developer has to ensure that the entire code base is loaded and in a clean state in the IDE before they can write compilable code.

        3. Overloaded web containers:

        Deploying a huge application makes the container take more time to start up, and while debugging, the repeated redeployment of the application to verify code changes takes a lot of time and kills developer productivity.

        4. Scalability:

        In a monolithic architecture, scalability is achieved in one dimension only, namely horizontal scaling. If the application receives a higher volume of requests, even though the traffic goes to only one or a few modules of the system, we can only scale the system as a whole by deploying it on multiple servers, due to which

          i) The cost of achieving scalability is going to be very high; as the whole system is scaled up, we need to buy big servers with huge computing capacity.

          ii) Different parts of the system have different computing requirements (a few modules are memory intensive, a few are CPU intensive), and during scale-up we cannot take such requirements into account.

        5. Scaling up the team is difficult:

        The bigger the application grows, the more people we need on the team to work on it, but handling such a huge team becomes a pain point, because people cannot work independently on separate functional modules due to dependencies, and distributing the work among them becomes much more complex.

        6. Long-term commitment to a technology stack:

        While building an application with a monolithic architecture, we have to commit to a technology stack for the long term, because adopting new technologies requires migrating the entire system, as it exists as a single code base.

        7. CI/CD is going to be very difficult:

        Whenever a module finishes its development, we cannot release it independently, because code changes from other modules are part of the same code base; long-term release planning and coordinated deployments are required.

        Microservices Architecture:

        Microservices is an architectural style of developing software applications. In a microservices-based architecture, we decompose the software application into

        • Smaller
        • Independent
        • Deployable
        • Loosely coupled
        • Collaborative Services

        We develop the application by breaking down the entire application into smaller, independent services that can be developed and delivered by individual teams of developers.

        We identify the business responsibilities and break them into independent services/projects that are built on REST architectural principles by independent teams and are deployable separately.
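
        As a rough illustration of what one such independently deployable, REST-based service might look like, here is a minimal Spring Boot sketch; the service name, endpoint, and sample data are hypothetical.

        import java.util.Arrays;
        import java.util.List;

        import org.springframework.boot.SpringApplication;
        import org.springframework.boot.autoconfigure.SpringBootApplication;
        import org.springframework.web.bind.annotation.GetMapping;
        import org.springframework.web.bind.annotation.RestController;

        // A hypothetical service owning a single business responsibility
        // (product catalog). It has its own code base, its own build, and its
        // own deployment lifecycle, independent of the other services.
        @SpringBootApplication
        public class ProductCatalogApplication {

            public static void main(String[] args) {
                SpringApplication.run(ProductCatalogApplication.class, args);
            }
        }

        @RestController
        class ProductController {

            // Exposes the service's capability over REST so other teams and
            // services can consume it without sharing code.
            @GetMapping("/products")
            public List<String> products() {
                return Arrays.asList("Laptop", "Phone", "Headphones");
            }
        }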

        Benefits of developing an application on microservices architecture

        1. Different teams of developers can independently develop, test, and deliver their parts of the system.

        2. Each service we develop has its own separate source code, which the developers can easily understand and maintain:

        •   – Achieving modularity becomes easy
        •   – The impact of a change will be very minimal
        •   – Debugging the code becomes very easy

        3. Since every team has its own independent source code, they can get the code up quickly in an IDE and proceed with development.

        4. The more functionality we break down into multiple independent services, the more teams we can have developing in parallel.

        5. Scalability:

          In a microservices-based application, as each service/module is deployed separately, we can also achieve vertical scaling.

        •   Depending on the traffic patterns, we can quickly scale a specific module or service rather than the entire system, so the cost of scalability is much lower because we are scaling only a piece of the system
        •   We can customize the machine capacity based on the nature of the service, such as CPU-oriented or memory-oriented services

        6. Adopting new technologies can be really fast: we can either choose one of the services to be migrated out of the current system or easily build new services on the latest technology.

        7. It is easy to achieve CI/CD: as every service is independent of the others, we can deliver a service without bothering about the others.

        Advantages of using microservices architecture

        • 1. As each service has a smaller code base, it is easy to manage within the IDE.
        • 2. Application servers are not overloaded, the application starts quickly, and debugging does not take much time because of the smaller code base.
        • 3. Scalability: we can achieve vertical scalability in microservices.
        •   – A specific module can be scaled up independently of the whole system based on the traffic/load.
        •   – We can customize the computing aspects when scaling a service, such as CPU-bound or memory-bound workloads.
        • 4. We can adopt new technologies quickly; as services are developed independently, we can migrate a service or develop new services on the latest technologies.
        • 5. The application is easy to understand and develop: as each service is built out of its own independent code base, developers often find it very easy to understand the system. Modularity can be achieved very easily, the impact of a change request is minimal and easy to manage, the system is less complex and easier to maintain, and debugging the application is going to be very easy.
        • 6. CI/CD can be adopted easily.
        • 7. We can have multiple teams developing the system in parallel.

        Conclusion:

        In the realm of software development, Spring Boot Microservices have emerged as a game-changer, offering agility, scalability, and modularity. We’ve explored the fundamental concepts behind microservices-based architecture and how they can reduce the complexity of software applications while enhancing delivery.

        However, it’s essential to remember that successfully implementing microservices goes beyond theoretical knowledge. To master this architectural style and unlock its full potential, you’ll need to dive into practical implementations and real-world examples.

        For More Practical Information:

        Fortunately, all the practical tutorials and hands-on guidance you need to embark on your microservices journey are right here on our blog. Explore our comprehensive tutorials, case studies, and examples to gain not only a theoretical understanding but also the practical skills to apply microservices effectively in your software projects.

        So, roll up your sleeves, venture into the world of microservices, and discover the transformative power they can bring to your software development endeavors—available exclusively on our blog!

        When it comes to implementing microservices architecture, it’s crucial to have a solid understanding of the key principles and best practices. For more in-depth insights and resources, I recommend visiting Microservices.io.