Friday 29 January 2021

Topics to study for Identity and Access Management Designer

Hey, 

I took this exam in Sep 2019, and in my personal view, it's by far the most interesting certification among all the designer ones. As part of my write-up series, I have been wanting to share my learnings for this exam but didn't get a chance to post them for quite some time. Again, you'll find everything in the Trailmix, but 
I have consolidated the key topics/points which I've found really useful:
 
Salesforce supports the following identity solutions: 
  • SAML (Security Assertion Markup Language)
  • OAuth  (Open Authorization) 
  • OpenID Connect 
Difference between Authentication and Authorization: 
  • Authentication - Verifies who you are before you can access the resource 
  • Authorization - Determines what you can access within the resource 
OAuth 2.0 Components:

1) Roles / Actors 

2) Scope / Consent [Parameters used to limit the client's access rights to resources]

3) Flows / Grants /Pattern 

4) Tokens

Access token — Short lived. After configuring an OAuth 2.0 connected app, generate an initial access token. Salesforce requires this token to authenticate the dynamic client registration request.

Refresh token — 
Long lived. A refresh token can have an indefinite lifetime, persisting for an admin-configured interval or until explicitly revoked. The client application can store a refresh token, using it to periodically obtain fresh access tokens.

To protect the token, the only hostname allowed with an HTTP callback URL is localhost. Other hosts must use HTTPS.
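To make the refresh token exchange concrete, here's a minimal sketch in Python. The token endpoint and parameter names are the standard OAuth 2.0 refresh token grant as Salesforce documents it; the client ID and refresh token values are placeholders, and the request itself isn't sent.

```python
# Sketch of the OAuth 2.0 refresh token grant: the form-encoded body a client
# posts to the token endpoint to obtain a fresh access token.
# "CLIENT_ID" and "REFRESH_TOKEN" are placeholder values, not real credentials.
from urllib.parse import urlencode

TOKEN_URL = "https://login.salesforce.com/services/oauth2/token"

def build_refresh_request(client_id, refresh_token):
    """Build the form body for exchanging a refresh token for an access token."""
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "refresh_token": refresh_token,
    }

body = urlencode(build_refresh_request("CLIENT_ID", "REFRESH_TOKEN"))
```

In a real client you'd POST this body to TOKEN_URL over HTTPS and read the new access token from the JSON response.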

OAuth 2.0 Grant: 

There are five OAuth 2.0 grant types: 
  1. Authorization Code Grant 
  2. Implicit Grant 
  3. Resource Owner Grant 
  4. Client Credentials Grant  [Machine to Machine - where the client is the Resource Owner]
  5. Device Flow Grant 
Salesforce OAuth 2.0 Flows:


There are nine Salesforce OAuth 2.0 flows. 

1) Web Server Flow (Secure server) | Grant type: Authorization Code


Apps hosted on a secure server use the web server authentication flow. A critical aspect of the web server flow is that the server must be able to protect the client secret. This flow uses an OAuth 2.0 authorization code grant type.

Some flows have important security considerations. For example, when using the web server flow, you must store the client secret securely.

Newer OAuth 2.0 apps using the Web Server flow are approved for additional devices after the user has granted access once. The User-Agent flow, in contrast, requires user approval every time.
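A quick sketch of step one of this flow - building the authorization URL that the app redirects the user's browser to. The /services/oauth2/authorize endpoint and the response_type=code parameter are the documented authorization code pieces; the client ID and callback URL below are placeholders.

```python
# Sketch: building the authorization request URL for the web server flow.
# The user's browser is redirected here; Salesforce later redirects back to
# redirect_uri with an authorization code the server exchanges for tokens.
from urllib.parse import urlencode

AUTHORIZE_URL = "https://login.salesforce.com/services/oauth2/authorize"

def build_authorize_url(client_id, redirect_uri):
    params = {
        "response_type": "code",  # authorization code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
    }
    return AUTHORIZE_URL + "?" + urlencode(params)

url = build_authorize_url("CLIENT_ID", "https://app.example.com/callback")
```

The callback URL shown is a made-up example; it has to match a callback URL registered on the connected app.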




2) User Agent Flow | Grant Type: Implicit Grant

We use this type of flow when there is no secure server, and that's why there's no refresh token - and it doesn't share the username and password with the app. 

3) Device Authentication Flow | Grant Type: Device Flow 

4) Username and Password Flow | Grant Type: Resource Owner 

Use this flow when the client is developed by the same authority as the authorization server, so there is implicit trust. Salesforce Communities don't support the username-password authentication flow.

5) JWT Bearer Token Flow (uses a digitally signed certificate - sub part of the Web Server, User Agent, Device Authentication, and Username & Password flows)

If you’re using the JWT Bearer Token or SAML Bearer Assertion flows, select Use Digital Signatures and upload a signing certificate.
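To illustrate, here's a sketch of the unsigned portion of such a JWT. The iss/sub/aud/exp claims and the RS256 algorithm are what Salesforce documents for this flow; actually signing the token requires an RSA library and the certificate's private key, so this sketch stops at the signing input, and all values are placeholders.

```python
# Sketch: the header and claims segments of a JWT for the JWT Bearer Token
# flow. iss = connected app consumer key, sub = Salesforce username,
# aud = login URL, exp = expiry. The RS256 signature itself is omitted.
import base64
import json
import time

def b64url(data):
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwt_signing_input(consumer_key, username, audience):
    header = {"alg": "RS256"}
    claims = {
        "iss": consumer_key,
        "sub": username,
        "aud": audience,
        "exp": int(time.time()) + 300,  # short-lived: 5 minutes ahead
    }
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())

signing_input = jwt_signing_input("CONSUMER_KEY", "user@example.com",
                                  "https://login.salesforce.com")
```

The finished token is this signing input plus "." plus the RS256 signature over it, posted to the token endpoint with grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer.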

6) SAML Bearer Assertion Flow

Sub part of the Web Server, User Agent, Device Authentication, and Username & Password flows - an app can reuse an existing authorization by supplying a signed SAML 2.0 assertion.

7) SAML Assertion Flow

Sub part of the Web Server, User Agent, Device Authentication, and Username & Password flows - all types of OAuth 2.0 flows, except for the SAML Assertion flow, require that you define a connected app.

This flow is an alternative for orgs that are using SAML to access Salesforce and want to access the web services API in the same way.

8) Asset Token Flow (Sub flow of Device Authentication for Connected Device)


9) Refresh Token Flow (Sub part of the Web Server and User Agent flows) 



Identity URLs:

When a client app makes an authorization request that’s successful, the HTTP response contains an identity URL along with the access token. The URL is returned in the id scope parameter.
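Since the identity URL has a fixed shape - https://login.salesforce.com/id/&lt;orgId&gt;/&lt;userId&gt; - the two IDs can be pulled out directly. The IDs in this sketch are made up:

```python
# Sketch: extracting the org ID and user ID from a Salesforce identity URL.
from urllib.parse import urlparse

def parse_identity_url(identity_url):
    """Return (org_id, user_id) from a URL of the form .../id/<org>/<user>."""
    parts = urlparse(identity_url).path.strip("/").split("/")
    return parts[1], parts[2]  # parts[0] is the literal "id"

org_id, user_id = parse_identity_url(
    "https://login.salesforce.com/id/00Dxx0000000000EAA/005xx000000000AAA"
)
```

Requesting that URL with the access token in the Authorization header returns details about the user.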

Federation ID isn’t owned by an interstellar shipping organization with nefarious designs. It’s basically a term that the identity industry uses to refer to a unique user ID.


Mobile-First Identity

Salesforce Mobile-First Identity simplifies the login experience for your customers. When we say login experience, we're referring to all aspects of identity verification and authentication: from sign-up, to login, to handling identity verification, to resetting passwords, to logging out.

Login Discovery doesn’t know how to verify the user until it determines—discovers—how the user is identified. Think of Login Discovery as a two-step process. Login Discovery supports both password and passwordless login.

Deep Linking - Used when we need to navigate directly to a specific page of an external system without going through the home page 

RelayState - Use the RelayState parameter when the user needs to be redirected to a specific page after successful SSO, for the best possible user experience

Identity Provider: An identity provider (IdP) is a service that stores and verifies user identity.

Service Provider: The service provider initiates login by sending a SAML request to the identity provider, asking it to authenticate the user. 


SAML Assertion: 

A SAML assertion is an XML security token issued by an identity provider and consumed by a service provider.

In some cases, you want to authenticate servers without interactively logging in each time the servers exchange information. For these cases, you can use the OAuth 2.0 JSON Web Token (JWT) bearer token flow.

Difference between SP-Initiated vs IdP-Initiated SSO

If SSO is initiated from the login page of the service provider, then it's service-provider-initiated SSO. If SSO is initiated via the identity provider's login page, then it's IdP-initiated. Salesforce can act as both service provider and identity provider; when we access apps like Workday from the App Launcher of Salesforce, that's identity-provider-initiated SSO and there will be no redirects. 


Delegated Authentication SSO 


If we can't use federated authentication - for example, when a browser redirect is not possible - we use delegated authentication. In delegated authentication SSO, we pass the responsibility to a third-party system via a custom web service, and the service returns true/false. It's additional work and it is not ideal, because the username and password need to be passed to the API.  
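The contract the custom web service has to satisfy is simple: given the username, password (or token), and source IP, return true or false. The SOAP plumbing aside, the decision logic is just this - the user store here is entirely hypothetical:

```python
# Sketch of the delegated authentication decision: Salesforce passes the
# credentials to your service, and your service answers true/false.
# HYPOTHETICAL_USER_STORE stands in for the real third-party system.
HYPOTHETICAL_USER_STORE = {"user@example.com": "s3cret"}

def authenticate(username, password, source_ip):
    """Return True only if the external system accepts the credentials."""
    # A real service might also check source_ip against an allowlist.
    return HYPOTHETICAL_USER_STORE.get(username) == password

ok = authenticate("user@example.com", "s3cret", "203.0.113.5")
```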



Useful Links





Tuesday 5 May 2020

Salesforce Work.com Capabilities

Salesforce announces Work.com to help companies and communities safely reopen after lockdown. Here's a summary of the features and pricing:

Command Center: 360-degree view of return-to-work readiness across locations, employees and visitors, make data-driven decisions, take action and communicate effectively - GA in June 2020

Contact Tracing: Ability to manually trace health and relationship contacts in a safe and private manner by data from individuals who are infected or potentially exposed to an infectious disease - GA in May 2020

Emergency Response Management: To manage all types of emergencies, deliver care to those affected and allocate resources and services quickly - GA in May 2020

Employee Wellness: To monitor and analyze employee and visitor health and wellness - GA in June 2020

Shift Management: To orchestrate the eventual return of employees to the office through shift management capabilities that can help reduce office density - GA in June 2020

myTrailhead for Employees: To help employees skill up on new ways of working with out-of-the-box training, learning and wellness programs - GA in June 2020

Volunteer & Grants Management: To help organizations fulfill their relief goals with flexible, scalable tools that streamline volunteer coordination and grant-making processes

Pricing: (All the features are included as add-ons of Salesforce license)


  • Command Center and Shift Management + Employee Wellness - starting from $5 per user per month
  • Emergency Response Management and Contact Tracing - add-on starting from $50 per user per month
  • myTrailhead for Employees - $25 per user per month


Thursday 10 October 2019

Topics to study for Integration Architecture Designer exam


I recently took my fourth designer exam (only one to go... yay!!). Overall the exam wasn't really difficult if you have relevant hands-on integration experience. The Trailmix has everything you need to study for this exam, so I have listed the key topics/points which I've found really useful: 

Integration Patterns

Integration Patterns are classified into two categories:

  • Data Integration - These patterns address the requirement to synchronise data that resides in two or more systems so that both systems always contain timely and meaningful data.
  • Process Integration - The patterns in this category address the need for a business process to leverage two or more applications to complete its task.

Here's the list of patterns:

  • Remote Process Invocation - Request and Reply
  • Remote Process Invocation - Fire and Forget
  • Batch Data Synchronization
  • Remote Call-In
  • UI Update Based on Data Changes
  • Data Virtualization


Pattern Selection Matrix 


Introduction to APIs

Here's the list of APIs with key details.

  • REST API: Its advantages include ease of integration and development, and it's an excellent choice of technology for use with mobile applications and web projects.
  • SOAP API: provides a powerful, convenient, and simple SOAP-based web services interface for interacting with Salesforce. 
  • Chatter REST API: Use Chatter REST API to display Chatter feeds, users, groups, and followers, especially in mobile applications.
  • User Interface API: Build Salesforce UI for native mobile apps and custom web apps using the same API that Salesforce uses to build Lightning Experience and Salesforce for Android, iOS, and mobile web.
  • Use Apex REST API: when you want to expose your Apex classes and methods so that external applications can access your code through REST architecture.
  • Use Apex SOAP API: when you want to expose Apex methods as SOAP web service APIs so that external applications can access your code through SOAP.
  • Analytics REST API: You can access Analytics assets—such as datasets, lenses, and dashboards—programmatically using the Analytics REST API.
  • Bulk API is based on REST principles and is optimized for loading or deleting large sets of data. 
  • Use Metadata API: to retrieve, deploy, create, update, or delete customizations for your org. 
  • Use Streaming API: to receive near-real-time streams of data that are based on changes in Salesforce records or custom payloads. 
  • Use Tooling API: to integrate Salesforce metadata with other systems
Salesforce APIs
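As a small illustration of the REST API, here's roughly what a query call looks like. The /services/data/v&lt;version&gt;/query path and the q parameter are the documented endpoint shape; the instance URL and API version here are placeholders:

```python
# Sketch: building the URL for a REST API SOQL query call. The request would
# be sent with an "Authorization: Bearer <access token>" header.
from urllib.parse import urlencode

def build_query_url(instance_url, api_version, soql):
    return f"{instance_url}/services/data/v{api_version}/query?" + urlencode({"q": soql})

url = build_query_url(
    "https://example.my.salesforce.com",  # placeholder instance URL
    "46.0",
    "SELECT Id, Name FROM Account LIMIT 5",
)
```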



Choreography Vs Orchestration

The difference between choreography and orchestration is:

  • Choreography (where applications are multi-participants and there is no central "controller") can be defined as "behaviour resulting from a group of interacting individual entities with no central authority."
  • Orchestration (where one application is the central "controller") can be defined as "behaviour resulting from a central conductor coordinating the behaviours of individual entities performing tasks independent of each other."

Streaming Events 

Use the type of streaming event that suits your needs.

PushTopic Event
Receive changes to Salesforce records based on a SOQL query that you define. The notifications include only the fields that you specify in the SOQL query.

Change Data Capture Event

Receive changes to Salesforce records with all changed fields. Change Data Capture supports more standard objects than PushTopic events and provides more features, such as header fields that contain information about the change. Change Data Capture is part of a pilot program. To participate in the pilot, contact Salesforce.

Platform Event

Publish and receive custom payloads with a predefined schema. The data can be anything you define, including business data, such as order information. Specify the data to send by defining a platform event. Subscribe to a platform event channel to receive notifications.

Generic Event

Publish and receive arbitrary payloads without a defined schema.
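For instance, a PushTopic is itself just a record you insert. The fields below (Name, Query, ApiVersion, NotifyForFields) are documented PushTopic fields; this sketch only builds the record as a dict, which in practice you'd insert via the REST or SOAP API or Apex:

```python
# Sketch: defining a PushTopic event as the record you'd insert. Subscribers
# then receive notifications containing only the fields named in the query.
def make_pushtopic(name, soql, api_version=46.0):
    return {
        "Name": name,
        "Query": soql,
        "ApiVersion": api_version,
        "NotifyForFields": "Referenced",  # notify when queried fields change
    }

topic = make_pushtopic("AccountUpdates", "SELECT Id, Name FROM Account")
```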


WSDL (Web Services Description Language) 

Enterprise WSDL

  • The Enterprise WSDL is strongly typed.
  • The Enterprise WSDL is tied (bound) to a specific configuration of Salesforce (i.e. a specific organization's Salesforce configuration).
  • The Enterprise WSDL changes if modifications (e.g. custom fields or custom objects) are made to an organization's Salesforce configuration.

Partner WSDL

  • The Partner WSDL is loosely typed.
  • The Partner WSDL can be used to reflect against/interrogate any configuration of Salesforce (i.e. any organization's Salesforce configuration).
  • The Partner WSDL is static, and hence does not change if modifications are made to an organization's Salesforce configuration.

Delegated Authentication WSDL

The delegated authentication WSDL document is for users who want to create a delegated authentication application to support single sign-on.

Metadata WSDL

The Metadata WSDL document is for users who want to use the Metadata API to retrieve or deploy customization information.

Apex WSDL

The Apex WSDL document is for developers who want to run or compile Apex scripts in another environment.

Identity Type

Determines whether you're using one set or multiple sets of credentials to access the external system 

  • Anonymous — No Identity and therefore no authentication 
  • Named Principal—Your entire Salesforce org shares one login account on the external system. Salesforce manages all authentication for callouts that specify a named credential as the callout endpoint so that you don’t have to. You can also skip remote site settings, which are otherwise required for callouts to external sites, for the site defined in the named credential.
  • Per User—Your org uses multiple login accounts on the external system. You or your users can set up their personal authentication settings for the external system 

Certificates
  • The API client certificate is used by workflow outbound messages, the AJAX proxy, and delegated authentication HTTPS callouts.
  • Certificates with 2048-bit keys last one year and are faster than certificates with 4096-bit keys. Certificates with 4096-bit keys last two years. You can have a maximum of 50 certificates.
  • The expiration date of the certificate record is updated to the expiration date of the newly uploaded certificate.
  • Don't delete a key unless you're absolutely certain no data is currently encrypted using the key. After you delete a key, any data encrypted with that key can no longer be accessed.
  • Replace the Default Proxy Certificate for SAML Single Sign-On

Reason for backup

  • Recover from data corruption (unintended user error or malicious activity)
  • Prepare for a data migration rollback
  • Archive data to reduce volumes
  • Replicate data to a data warehouse/BI
  • Take snapshots of development versions
  • Vertical optimization: Backup time is, among other parameters, proportional to the number of records you are retrieving. Partial backup is also a type of vertical optimization.
  • Horizontal optimization: Backup time is, among other parameters, proportional to the number and types of columns you are retrieving. 
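The two optimizations map directly onto the shape of the backup query: horizontal optimization selects fewer columns, vertical optimization (partial backup) selects fewer rows. A sketch with generic object and field names:

```python
# Sketch: composing a backup SOQL query. Fewer fields = horizontal
# optimization; a WHERE filter (e.g. only records modified since the last
# backup) = vertical optimization / partial backup.
def backup_query(fields, obj, since=None):
    soql = f"SELECT {', '.join(fields)} FROM {obj}"
    if since:
        soql += f" WHERE LastModifiedDate >= {since}"
    return soql

full = backup_query(["Id", "Name"], "Account")
partial = backup_query(["Id", "Name"], "Account", "2019-10-01T00:00:00Z")
```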

Sandboxes

  • A full sandbox copies the production org but using it as a DRP (Disaster Recovery Plan) is not recommended, neither as an alternative production environment (because the related infrastructure is not meant for production usage) nor as a backup (because there’s no guarantee of data integrity of the copy, and copy is not point-in-time).
  • Full org copy should also not be used as a substitute to Salesforce Org Sync

Salesforce Developer Limits

  • Formulas: maximum length 3,900 characters
  • Lightning pages: maximum components in a region - 25
  • Master-detail relationship: maximum child records - 10,000 
  • Recycle Bin: maximum records:  25 times your MB storage capacity as records. For example, an org with a storage allocation of 2,000MB (2GB) can have 50,000 records in the Recycle Bin: 25 x 2,000 = 50,000 records.
  • Each flow can have up to 50 versions and 2,000 steps 
  • Each org can have up to 500 active flows, 1,000 flows total, 30,000 waiting interviews at a given time, 1,000 events processed per hour and 20,000 defined relative alarm events across all flows and flow versions
  • If a file exceeds the maximum size, the text within the file isn't searched
  • PDF .pdf 25 MB and Word .doc, .docx, .docm 25 MB
  • The maximum number of times a file can be shared is 10
  • File storage and data storage are calculated asynchronously, so if you import or add a large number of records or files, the change in your org’s storage usage isn’t reflected immediately.
  • On-Demand Email-to-Case: Number of user licenses multiplied by 1,000; maximum 1,000,000
  • Per-transaction Apex limits: total SOQL queries issued - 100; total records retrieved by SOQL queries - 50,000; total DML statements issued - 150; total records processed as a result of DML statements - 10,000
  • Process limits: total characters in a process name - 255; total characters in a process's API name - 77; total versions of a process - 50; total criteria nodes in a process - 200
  • The daily limit for emails sent through email alerts is 1,000 per standard Salesforce license per org. The overall org limit is 2,000,000. 
  • These limits count for each Apex transaction. For Batch Apex, these limits are reset for each execution of a batch of records in the execute method.
  • This limit doesn’t apply to custom metadata types. In a single Apex transaction, custom metadata records can have unlimited SOQL queries
  • API Calls: total calls per 24-hour period - up to a maximum of 1,000,000
  • You can submit up to 10,000 batches per rolling 24-hour period
  • A batch can contain a maximum of 10,000 records
  • Batches are processed in chunks. The chunk size depends on the API version. In API version 20.0 and earlier, the chunk size is 100 records. In API version 21.0 and later, the chunk size is 200 records. 
  • There’s no limit on sending individual emails to contacts, leads, person accounts, and users in your org directly from account, contact, lead, opportunity, case, campaign, or custom object pages.
  • Using the API or Apex, you can send single emails to a maximum of 5,000 external email addresses per day
  • Change sets: Inbound and outbound change sets can have up to 10,000 files of metadata.
  • In each specified relationship, no more than five levels can be specified in a child-to-parent relationship. For example, Contact.Account.Owner.FirstName (three levels).
  • Big objects: don’t support the following operators. – !=, LIKE, NOT IN, EXCLUDES, and INCLUDES
  • Visualforce Limit: Maximum rows retrieved by queries for a single Visualforce page request 50,000
  • Maximum records that can be handled by StandardSetController 10,000
  • Maximum collection items that can be iterated in an iteration component such as <apex:pageBlockTable> or <apex:repeat> - 1,000

Useful Links:

Middleware Terms and Definitions

Integration Patterns Overview

Wednesday 11 September 2019

Salesforce Einstein Analytics Capabilities

KEY TAKEAWAYS:

Einstein Analytics Product (Wave + Prediction Builder + Discovery):


Einstein Analytics in its simplest form is a combination of Wave Analytics, Einstein Prediction Builder and Einstein Discovery.

Wave Capability - Pulls data from different sources. The only change is that now we can join different tables without using code. Earlier, we used to write SAQL (Salesforce Analytics Query Language) to do inner joins.

Prediction Builder Capability - It predicts the trend as a number (what's the likelihood of something happening, as a percentage), same as Einstein Prediction Builder. The key difference: Einstein Prediction Builder has the limitation that we can't do cross-object predictions; here that limitation is gone because the source of data is the Wave engine.

Discovery Capability - It tells the narrative (story) of the data, same as Einstein Discovery; the only difference, again, is that the data behind the scenes can be enriched as it's coming from the Wave engine.

Einstein Analytics Flow:

·       Data finds the customer  (Data coming from different sources)
·       Salesforce add narrative to that data (Add story around that data)
·       Predict what’s happening  (What happened in the past)
·       Predict what’s likely to happen (What’s going to happen in future, based on trends)
·       Do actions based on insights (Perform actions to change insights in your favour)


Einstein Analytics Assets:

·       App - It's a collection of dashboards.
·       Dashboard - It's a collection of lenses (reports).
·       Lens - It's a report, which is built on a target dataset.
·       Target Dataset - It's a dataset which can be used to create lenses. It can be created using multiple recipes.
·       Recipe - It's simply a saved set of transformations, or steps, that you want to perform on a specific source dataset.
·       Source Dataset - Raw dataset coming from a different source, e.g. Oracle ERP, NetSuite etc.

Frequency of data refresh:

You can set the frequency of your data to be refreshed on the following basis:

·       Time Based (Minimum is hourly)
·       Action Based (Refresh data based on any action)

Einstein Analytics License Structure

·       Einstein Analytics Plus - Storage: 1 Billion Rows; Artificial Intelligence: Yes; Business Intelligence: Yes
·       Growth - Storage: 1 Million Rows; Artificial Intelligence: No; Business Intelligence: Yes
·       Predictions - Storage: N/A; Artificial Intelligence: Yes; Business Intelligence: No

Random Points:


·       Minimum data to do analysis should be 5,000 records.
·       Not all use cases are achievable using Einstein Analytics. As of now, it does predictions only as numbers, e.g. the % likelihood of something happening.
·       As of now, we can't change the Machine Learning algorithm under the hood, but in future we might be able to plug in our own ML algorithm if we need to.
·       We can launch any action from a dashboard which we've defined in the platform, e.g. triggers, quick actions etc.
·       You can define data cleaning rules and reuse them to enrich data periodically.
·       Einstein Analytics runs on a separate cluster from the platform; it copies the data from different sources onto its own cluster to store it as big flat files. We can add conditions on copying the data if we don't want to copy over everything.