Tuesday, December 28, 2021

Batch Apex

Batch Apex is the asynchronous execution of Apex code, designed specifically for processing large numbers of records, and it enjoys more flexible governor limits than synchronous code.

Need for Batch Apex:

  • Fetching or firing DML on thousands of records is impractical in synchronous Apex: Salesforce will not let a single transaction operate on more than the number of records allowed by the governor limits.
  • Yet medium to large enterprises need to manage thousands of records every day, adding, editing, or deleting them as needed.
  • For this, Salesforce provides a powerful feature called Batch Apex. Batch Apex lets you handle a large number of records and manipulate them by splitting the work into batches.

The Database.Batchable interface has the following three methods, all of which must be implemented.

start() :

Called at the beginning of a batch Apex job, start() collects the records or objects to pass to the execute() method. It returns either a Database.QueryLocator object or an Iterable that contains the records passed to the job.

execute() :

To do the required processing on each chunk of data, use the execute() method. It is called once for each batch of records passed to it, and it receives a reference to the Database.BatchableContext object along with the list of records in the current batch.

finish() :

To send confirmation emails or execute post-processing operations, we use finish(). This method is called after all batches are processed.

Note: The order of execution of batches is not guaranteed.

To implement Batch Apex, follow the steps below.

	Step 1: Create a global class that implements the
		"Database.Batchable" interface.
			
		Syntax:
			global class <ClassName> implements Database.Batchable<SObject>
			{
				// Write the Logic..
			}
			
	Step 2: Provide the implementation for the interface methods.
		
		Syntax:
			global class <ClassName> implements Database.Batchable<SObject>
			{
				global Database.QueryLocator start(Database.BatchableContext <refName>)
				{
					// Write the start method logic..
				}
				
				global void execute(Database.BatchableContext <refName>,
									List<SObject> recordsToProcess)
				{
					// Write the execute method logic..
				}
				
				global void finish(Database.BatchableContext <refName>)
				{			
					// Write the finish method logic..
				}
			}

	Step 3: Invoke the Batch Class.
			
			Step 1: Create the Object of the Batch Class.
				Ex:
					<BatchClassName> <objectName> = new <BatchClassName>();
					
			Step 2: Invoke the Batch Class by using the "Database.executeBatch()" method.
				Ex:
					ID jobId = Database.executeBatch(<batchClassObjectName>);
					
								(OR)
								
					ID jobId = Database.executeBatch(<batchClassObjectName>, batchSize);
					
			Ways to Invoke the Batch Class:
			-------------------------------
			1. We can Invoke from Execute Anonymous Window.
			2. We can invoke from Another Batch Class.
			3. We can invoke from "Visualforce Page".
			4. We can Schedule the Batch Job.

	Step 4: Track the Status of the Batch Class.
			
			Case 1: Track the Status from the Setup Wizard.
					Setup --> Monitor --> Jobs --> Apex Jobs.
					
			Case 2: Get the Status through Programming by Querying from "AsyncApexJob"
					object.
					
					AsyncApexJob jobDetails = [SELECT Id, Status, TotalJobItems,
											   JobItemsProcessed, NumberOfErrors,
											   CreatedBy.Email
											   FROM AsyncApexJob
											   WHERE Id = :<batchJobId>];


Tuesday, November 23, 2021

Contract Testing

 Contract testing is a methodology for ensuring that two separate systems (such as two microservices) are compatible and able to communicate with one another.

 It captures the interactions that are exchanged between each service, storing them in a contract, which can then be used to verify that both parties adhere to it.

Contract testing goes beyond schema testing, requiring both parties to come to a consensus on the allowed set of interactions and allowing for evolution over time.

Pact is a code-first consumer-driven contract testing tool, and is generally used by developers and testers who code. The contract is generated during the execution of the automated consumer tests.

A major advantage of this pattern is that only parts of the communication that are actually used by the consumer(s) get tested. 

This in turn means that any provider behaviour not used by current consumers is free to change without breaking tests.
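The consumer-driven flow described above can be illustrated with a minimal, self-contained sketch. This is plain Python, not the Pact API, and the endpoint, field names, and handler are hypothetical: a contract captured from the consumer's tests records the one interaction the consumer relies on, and the provider is then verified against only that.

```python
# Illustrative sketch of consumer-driven contract testing (not Pact itself).

# The contract captured from the consumer's tests: for this request,
# the consumer only relies on these fields being present in the response.
contract = {
    "request": {"method": "GET", "path": "/users/42"},
    "response": {"status": 200, "required_fields": ["id", "name"]},
}

def provider_handle(method, path):
    """A stand-in for the real provider. Extra fields are fine: the
    consumer never reads them, so they sit outside the contract."""
    if method == "GET" and path == "/users/42":
        return 200, {"id": 42, "name": "Alice", "internal_flag": True}
    return 404, {}

def verify(contract, handler):
    """Replay the contracted request against the provider and check only
    what the consumer actually depends on."""
    req, expected = contract["request"], contract["response"]
    status, body = handler(req["method"], req["path"])
    assert status == expected["status"], f"status {status} != {expected['status']}"
    missing = [f for f in expected["required_fields"] if f not in body]
    assert not missing, f"provider response missing fields: {missing}"
    return True

print(verify(contract, provider_handle))  # True: provider honours the contract
```

Because `internal_flag` is not in the contract, the provider is free to rename or drop it without breaking verification, which is exactly the advantage noted above.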

Saturday, September 25, 2021

Keycloak

 Keycloak is an open source Identity and Access Management solution targeted towards modern applications and services.

Keycloak offers features such as Single-Sign-On (SSO), Identity Brokering and Social Login, User Federation, Client Adapters, an Admin Console, and an Account Management Console.

Below are Keycloak's main features:

1)Multiple protocols support

2)SSO support

3)Offers Web based GUI

4)External Identity Source Sync

If your client already has some kind of user database, Keycloak can synchronize with it. By default it supports LDAP and Active Directory, but you can create custom extensions for any user store using the Keycloak User Storage SPI.

5)Identity Brokering

Keycloak can also work as a proxy between your users and some external identity provider or providers. Their list can be edited from Keycloak Admin Panel.

6)Social Identity Providers

Additionally, Keycloak allows us to use social identity providers. It has built-in support for Google, Twitter, Facebook, and Stack Overflow, but each of them has to be configured manually from the admin panel.

7)Customizations

Currently, Keycloak ships in the following distributions:

1)server

2)Docker Image

3)Operator

Link: https://www.keycloak.org/
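As a concrete illustration of the protocol support above, the sketch below builds Keycloak's standard OpenID Connect token endpoint URL and exchanges user credentials for an access token. The server URL, realm, and client ID are hypothetical placeholders, and the password grant requires the "Direct Access Grants" toggle to be enabled on the client.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://keycloak.example.com"  # hypothetical server
REALM = "myrealm"                          # hypothetical realm

def token_endpoint(base_url, realm):
    """Keycloak's OpenID Connect token endpoint for a realm.
    (Older Keycloak distributions prefix the path with /auth.)"""
    return f"{base_url}/realms/{realm}/protocol/openid-connect/token"

def fetch_token(username, password, client_id="my-client"):
    """Exchange user credentials for an access token via the
    Resource Owner Password Credentials grant."""
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
    }).encode()
    with urllib.request.urlopen(token_endpoint(BASE_URL, REALM), data=body) as resp:
        return json.loads(resp.read())["access_token"]
```

The returned access token is a JWT that can then be sent to your own services as a Bearer header.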

Saturday, July 31, 2021

Core Banking

 Core banking is a banking service provided by a group of networked bank branches where customers may access their bank account and perform basic transactions from any of the member branch offices.

Core banking systems typically include deposit, loan and credit-processing capabilities, with interfaces to general ledger systems and reporting tools.

Open-source technology in core banking solutions can help banks maintain their productivity and profitability at the same time.

Below are some of the core banking software products on the market:

  • CRMNEXT
  • Fisa Group
  • Finastra
  • Turnkey Lender
  • Q2eBanking
  • nCino
  • Temenos

Wednesday, May 26, 2021

Confluent Hub Client

Confluent Hub is an online repository of extensions and components for Kafka. Kafka is built on an extensible model for many of its services. 

It allows plug-ins and extensions, which makes it generic enough to suit many real-world streaming applications. The hub hosts both Confluent and third-party components. 

Generally, you would go there to search for components including:

  • Connectors
  • SMT (Single Message Transforms)
  • Converters

Installing Connectors from Confluent Hub:

The enterprise version of Confluent provides a script for installing Connectors and other components from Confluent Hub (the script is not included in the Open Source version).

If we're using the enterprise version, we can install a connector using the following command:

$CONFLUENT_HOME/bin/confluent-hub install confluentinc/kafka-connect-mqtt:1.0.0-preview

Many more connectors can be found by browsing Confluent Hub.

Tuesday, May 25, 2021

Access Bitbucket using python

Bitbucket is a Git-based source code repository hosting service owned by Atlassian.

Bitbucket Server is a combination Git server and web interface product written in Java and built with Apache Maven.

It allows users to do basic Git operations  while controlling read and write access to the code. It also provides integration with other Atlassian tools.

Nowadays, demand for Python for reporting and for ETL operations is increasing.

Using the Python requests library, we can access the Bitbucket API.

Below is a sample code snippet.

import io
import requests
import pandas as pd

url = 'https://api.bitbucket.org/2.0/repositories/Abcd'
headers = {'Content-Type': 'application/json'}
USERNAME = 'xxxxxx'
PASSWORD = 'yyyyyy'

response = requests.get(url, auth=(USERNAME, PASSWORD), headers=headers)
if response.status_code != 200:
    print('Status:', response.status_code, 'Headers:', response.headers,
          'Error Response:', response.json())
    raise SystemExit()

# The repository entries come back under the 'values' key of the JSON payload.
df = pd.read_json(io.StringIO(response.text))
dk = pd.json_normalize(df['values'])
dk.to_excel('Bitbucket_INC_Report.xlsx', sheet_name='SLA_Report', index=False)
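One caveat with the snippet above: Bitbucket's 2.0 API is paginated, so a single GET returns only the first page, with each page carrying its rows under 'values' and, when more data exists, the URL of the next page under 'next'. The sketch below (my own helper, not part of the Bitbucket client libraries) walks every page; the fetch function is injected so the logic can be tested without the network — with requests you would pass something like `lambda u: requests.get(u, auth=(USERNAME, PASSWORD)).json()`.

```python
# Walk a paginated Bitbucket-style API by following 'next' links.

def iter_all_values(first_url, fetch_page):
    """Yield every item from every page. fetch_page(url) must return the
    decoded JSON dict for that page."""
    url = first_url
    while url:
        page = fetch_page(url)
        yield from page.get("values", [])
        url = page.get("next")  # absent/None on the last page

# Stub pages standing in for real API responses:
pages = {
    "p1": {"values": [1, 2], "next": "p2"},
    "p2": {"values": [3]},
}
print(list(iter_all_values("p1", pages.get)))  # [1, 2, 3]
```

Feeding the collected items to `pd.json_normalize` then produces a report covering all repositories, not just the first page.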

Sunday, May 16, 2021

Install and Use Gremlin in a Docker Container

Gremlin is a simple, safe and secure way to use Chaos Engineering to improve system resilience. You can use Gremlin with Docker in a variety of ways.

It is possible to attack Docker containers and it is also possible to run Gremlin in a container to create attacks against the host or other containers.

• Create a Gremlin account: https://www.gremlin.com/demo/

• Log in to the Gremlin App using your company name and sign-on credentials.

• Identify the "Team ID" and "Secret Key" by navigating to Settings >> Team Settings >> Configuration.

• Issue the command below in Docker to pull the official Gremlin Docker image and run the Gremlin daemon:

docker run -d --net=host \
  --cap-add=NET_ADMIN --cap-add=SYS_BOOT --cap-add=SYS_TIME \
  --cap-add=KILL \
  -v $PWD/var/lib/gremlin:/var/lib/gremlin \
  -v $PWD/var/log/gremlin:/var/log/gremlin \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e GREMLIN_TEAM_ID="$GREMLIN_TEAM_ID" \
  -e GREMLIN_TEAM_SECRET="$GREMLIN_TEAM_SECRET" \
  gremlin/gremlin daemon

• Use docker ps to see all running Docker containers:

       sudo docker ps

• Run a quick command inside your Gremlin container to verify it is responding:

     sudo docker exec -it <gremlin container_id> echo "Running"

• From within the container, check out the available attack types:

      gremlin help attack-container

ES12 new Features