Development-Lifecycle-and-Deployment-Architect Practice Test Questions

226 Questions


Sales and Service products will be created by two teams that will use second-generation managed package(s). The Sales team will use a specific function of the Service product, but the architect wants to ensure that this team will only use the functions exposed by the Service team. No other team will use these same functions. What should an architect recommend?


A. Create two second-generation managed packages with the same namespace and set the methods that should be shared with the @NamespaceAccessible annotation.


B. Create two managed packages with Sales and Service namespaces. Set the methods to be shared with the @salesAccessible annotation.


C. Create a managed package with both products and create a code review process with an approver from each team.


D. Create two managed packages. Create an authentication function in the Service package that will return a token if a Sales user is authorized to call the exposed function. Validate the token in the Service functions.





A.
  Create two second-generation managed packages with the same namespace and set the methods that should be shared with the @NamespaceAccessible annotation.

Explanation:

The architect should recommend creating two second-generation managed packages with the same namespace and setting the methods that should be shared with the @NamespaceAccessible annotation. This allows the Sales team to access the specific functions of the Service product without exposing them to other teams or customers. Creating two managed packages with different namespaces would not allow the Sales team to access the Service functions unless they were declared global, which would expose them to everyone. Creating a single managed package with both products would not allow separation of the products or control over the functions. Creating an authentication function in the Service package would add unnecessary complexity and overhead to the solution.
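The recommended approach can be sketched in Apex. This is a minimal illustration, assuming both packages are built in the same namespace; the class, method, and object names are invented for the example:

```apex
// In the Service package (same namespace as the Sales package).
// @NamespaceAccessible exposes public Apex to other second-generation
// packages in the SAME namespace without making it global, so no other
// team or subscriber org can call it.
@NamespaceAccessible
public class ServiceEntitlementService {
    @NamespaceAccessible
    public static Boolean isEntitled(Id accountId) {
        // Illustrative placeholder logic.
        return [SELECT COUNT() FROM Entitlement WHERE AccountId = :accountId] > 0;
    }
}
```

The Sales package, built in the same namespace, can then call `ServiceEntitlementService.isEntitled(...)` directly, while code outside the namespace cannot.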

Metadata API supports deploy() and retrieve() calls for file-based deployment. Which two scenarios are the primary use cases for writing code that calls the retrieve() and deploy() methods directly? Choose 2 answers


A. Team development of an application in a Developer Edition organization. After completing development and testing, the application is distributed via Lightning Platform AppExchange.


B. Development of a custom application in a scratch org. After completing development and testing, the application is then deployed into an upper sandbox using the Salesforce CLI (SFDX).


C. Development of a customization in a sandbox organization. The deployment team then utilizes the Ant Migration Tool to deploy the customization to an upper sandbox for testing.


D. Development of a custom application in a sandbox organization. After completing development and testing, the application is then deployed into a production organization using Metadata API.





A.
  Team development of an application in a Developer Edition organization. After completing development and testing, the application is distributed via Lightning Platform AppExchange.

D.
  Development of a custom application in a sandbox organization. After completing development and testing, the application is then deployed into a production organization using Metadata API.

Explanation:

The Metadata API is mainly used for file-based deployment, such as deploying an application from a Developer Edition org to the AppExchange, or from a sandbox org to a production org. The Ant Migration Tool is a wrapper around the Metadata API, so it is not a direct use case for writing code to call retrieve() and deploy() methods. The Salesforce CLI (SFDX) uses the Source-Driven Development model, which relies on the source code as the source of truth, rather than the Metadata API.
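For contrast, the Ant Migration Tool mentioned above drives those same retrieve()/deploy() calls declaratively rather than through hand-written code. A build.xml sketch, where the usernames, passwords, and paths are placeholders:

```xml
<!-- build.xml sketch for the Ant Migration Tool, which wraps the same
     Metadata API retrieve()/deploy() calls. Credentials and paths are
     placeholders. -->
<project name="SandboxMigration" default="deployToUpperSandbox"
         xmlns:sf="antlib:com.salesforce">
    <target name="retrieveFromDev">
        <sf:retrieve username="${dev.username}" password="${dev.password}"
            serverurl="https://test.salesforce.com"
            retrieveTarget="src" unpackaged="src/package.xml"/>
    </target>
    <target name="deployToUpperSandbox">
        <sf:deploy username="${uat.username}" password="${uat.password}"
            serverurl="https://test.salesforce.com"
            deployRoot="src" checkOnly="false" rollbackOnError="true"/>
    </target>
</project>
```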

Ursa Major Solar (UMS) has used Aura components significantly in its Salesforce application development. UMS has established a robust test framework, and the development team follows the Salesforce-recommended testing practices. The UMS team uses Salesforce's test tools to check for common accessibility issues. In which two environments can the UMS team call Aura accessibility tests? Choose 2 answers


A. JSTEST


B. ACCTEST


C. WebDriver Test


D. AuraDriver Test





A.
  JSTEST

C.
  WebDriver Test

Explanation:

Aura accessibility tests can be called in JSTEST and WebDriver Test environments. JSTEST is a JavaScript testing framework that runs on Node.js and can be used to test Aura components. WebDriver Test is a Selenium-based testing framework that can be used to test the user interface and accessibility of Aura components. ACCTEST and AuraDriver Test are not valid environments for calling Aura accessibility tests.

As part of a technical debt cleanup project, a large list of metadata components has been identified by the business analysts at Universal Containers for removal from the Salesforce org. How should an architect manage these deletions across sandbox environments and production with minimal impact on other work streams?


A. Generate a destructiveChanges.xml file and deploy the package via the Force.com Migration Tool


B. Perform deletes manually in a sandbox and then deploy a Change Set to production


C. Assign business analysts to perform the deletes and split up the work between them


D. Delete the components in production and then refresh all sandboxes to receive the changes





A.
  Generate a destructiveChanges.xml file and deploy the package via the Force.com Migration Tool

Explanation:

A is the correct answer: generating a destructiveChanges.xml file and deploying the package via the Force.com Migration Tool is the best way to manage the deletions of metadata components across sandbox environments and production with minimal impact on other work streams. A destructiveChanges.xml file is a manifest that specifies the components to be deleted from an org, and it can be deployed using the Force.com Migration Tool, a command-line tool that uses the Metadata API to retrieve and deploy metadata. This approach automates and streamlines the deletion process and ensures consistency and accuracy across environments. B is incorrect, as performing the deletes manually in a sandbox and then deploying a change set to production can introduce errors and inconsistencies, and requires additional steps and permissions. C is incorrect, as assigning business analysts to perform the deletes and splitting the work between them creates confusion and complexity, and lacks coordination and integration.

D is incorrect, as deleting the components in production and then refreshing all sandboxes to receive the changes disrupts the production environment and the ongoing development and testing activities in the sandboxes. You can learn more about this topic in the Deploy Changes with the Force.com Migration Tool unit on Trailhead.
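A destructiveChanges.xml sketch follows; the component names are invented for illustration, and the file is deployed alongside a package.xml that carries only the API version:

```xml
<!-- destructiveChanges.xml: every member listed here is DELETED when
     the package is deployed. Component names are illustrative. -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>ObsoleteInvoiceHelper</members>
        <members>LegacyNightlyBatch</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>Account.Legacy_Flag__c</members>
        <name>CustomField</name>
    </types>
</Package>
```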

What are two key benefits of fully integrating an agile issue tracker with software testing and continuous integration tools? Choose 2 answers


A. Developers can see automated test statuses for commits on a specific user story.


B. Developers can collaborate and communicate effectively on specific user stories.


C. Developers can observe their team velocity on the burn chart report in the agile tool.


D. Developers can use the committed code's build status directly on the user story record.





A.
  Developers can see automated test statuses for commits on a specific user story.

D.
  Developers can use the committed code's build status directly on the user story record.

Explanation:

Integrating an agile issue tracker with software testing and continuous integration tools provides the following benefits: developers can see automated test statuses for commits on a specific user story, which helps them identify and fix errors or failures quickly, and developers can see the committed code's build status directly on the user story record, which helps them track the progress and quality of their work.

When replacing an old legacy system with Salesforce, which two strategies should the plan consider to mitigate the risks associated with migrating data from the legacy system to Salesforce? Choose 2 answers


A. Identify the data relevant to the new system, including dependencies, and develop a plan/scripts for verification of data integrity.


B. Migrate users in phases based on their functions, requiring parallel use of the legacy system and Salesforce for a certain period of time.


C. Use a full sandbox environment for all the systems involved, a full deployment plan with test data generation scripts, and full testing including integrations.


D. Use a full sandbox environment and perform test runs of data migration scripts/processes with real data from the legacy system.





A.
  Identify the data relevant to the new system, including dependencies, and develop a plan/scripts for verification of data integrity.

D.
  Use a full sandbox environment and perform test runs of data migration scripts/processes with real data from the legacy system.

Explanation:

Identifying the relevant data and verifying the data integrity can help ensure the quality and accuracy of the migrated data. Using a full sandbox and performing test runs with real data can help validate the migration process and identify any issues or risks.

A technical lead is performing all code reviews for a team and is finding many errors and improvement points. This is delaying the team's deliveries. Which two actions can effectively contribute to the quality and agility of the team? Choose 2 answers


A. Choose the most senior developer to help the technical lead in the code review.


B. Create development standards and train teams in those standards.


C. Skip the code review and focus on functional tests and UAT.


D. Use a static code analysis tool in the pipeline before manual code review.





B.
  Create development standards and train teams in those standards.

D.
  Use a static code analysis tool in the pipeline before manual code review.

Explanation:

The two actions that can effectively contribute to the quality and agility of the team are creating development standards and training teams in those standards, and using a static code analysis tool in the pipeline before manual code review. Code standards help ensure consistency, readability, and maintainability of the code, and reduce errors and bugs. A static code analysis tool can automate part of the code review process and flag issues or violations of the code standards before the manual review. Choosing the most senior developer to help the technical lead or skipping the code review are not effective actions, as they can lead to more errors and delays.
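Running static analysis before human review can be wired into the pipeline so every pull request is gated automatically. A sketch in GitHub Actions syntax, assuming the Salesforce Code Analyzer (sfdx-scanner) plugin; the exact flags and threshold vary by plugin version:

```yaml
# CI job sketch: run static code analysis before manual code review.
# The plugin, paths, and severity threshold are assumptions for the
# example, not a definitive pipeline.
name: static-analysis
on: [pull_request]
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install --global @salesforce/cli
      - run: sf plugins install @salesforce/sfdx-scanner
      # Fail the build on severe findings, so reviewers only see code
      # that already passes the automated bar.
      - run: sf scanner run --target "force-app" --severity-threshold 2
```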

There are many types of quality assurance techniques that can help minimize defects in software projects. Which two techniques should an architect recommend, for Universal Containers to incorporate into its overall CI/CD pipeline? Choose 2 answers


A. Business verification testing


B. Stress testing


C. Automated browser testing


D. Static code quality analysis





C.
  Automated browser testing

D.
  Static code quality analysis

Explanation:

Automated browser testing and static code quality analysis are two quality assurance techniques that can help minimize defects in software projects, and that an architect should recommend for Universal Containers to incorporate into its overall CI/CD pipeline. Automated browser testing is a technique that involves using tools or frameworks to simulate user interactions with the web application across different browsers and devices, and to verify the functionality and performance of the application. Static code quality analysis is a technique that involves using tools or frameworks to scan the code and detect any violations of the predefined coding rules and best practices, such as syntax errors, security issues, code smells, etc. Business verification testing and stress testing are also quality assurance techniques, but they are not as suitable or relevant for the CI/CD pipeline, as they are more focused on validating the business requirements and the system capacity.

Universal Containers (UC) maintains its Salesforce org using its internal tools and processes for managing its application lifecycle. The UC team has been facing challenges in its development processes in the two most recent releases. The architect has recommended that the UC team follow the org development model to address these challenges. Which two characteristics of the org development model will help UC address the challenges? Choose 2 answers


A. Automated deployment


B. Automated defect fixing


C. Automated sandbox provisioning


D. Automated change tracking





A.
  Automated deployment

C.
  Automated sandbox provisioning

Explanation:

The org development model is a traditional approach that uses sandboxes as the primary development environments. It relies on tools such as change sets, the Ant Migration Tool, or the Metadata API to deploy changes between orgs. One of the benefits of this model is that it allows automated deployment, meaning that the deployment process can be scripted and executed without manual intervention. This can save time and reduce errors. Another benefit of this model is that it allows automated sandbox provisioning, meaning that the creation and configuration of sandboxes can be done programmatically using the Sandbox API or the Salesforce CLI. This can help maintain consistency and alignment across different environments. Automated defect fixing and automated change tracking are not characteristics of the org development model, but rather of the package development model, which uses source code as the source of truth and supports source tracking and automated testing.
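Scripted sandbox provisioning can be done by passing a sandbox definition file to the Salesforce CLI command `sf org create sandbox --definition-file <file> --target-org <prod alias>`. A minimal sketch of such a definition file; the name and description are illustrative, and the available options depend on your licenses:

```json
{
  "sandboxName": "dev1",
  "licenseType": "Developer",
  "description": "Scripted developer sandbox for the release branch"
}
```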

Universal Containers has a highly integrated environment with significant process orchestration between systems. When refreshing UAT, objects that have external IDs from production no longer point to valid external IDs in the UAT environment. What should an architect do to resolve this?


A. Let UAT point to production integrations and roll back each transaction after it finishes.


B. Delete all the data and use an automated testing tool to create new data across all the systems in UAT.


C. Mask the External Id so nobody can see the production value.


D. In the post-refresh plan, modify external IDs to a known valid set of values for UAT.





D.
  In the post-refresh plan, modify external IDs to a known valid set of values for UAT.

Explanation:

In the post-refresh plan, modifying external IDs to a known valid set of values for UAT is the best way to resolve the issue of objects whose external IDs from production no longer point to valid external IDs in the UAT environment. This maintains data integrity and consistency across the integrated systems. Letting UAT point to production integrations and rolling back each transaction is not good practice, as it can cause data loss or corruption in production. Deleting all the data and using an automated testing tool to create new data across all the systems in UAT is not feasible, as it takes significant time and resources. Masking the external ID so nobody can see the production value does not address the underlying problem of invalid references.
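One way to implement the post-refresh step is a small anonymous Apex script run after the refresh completes. This is a sketch only: the object and field names (`Integration_Account__c`, `External_Id__c`) and the ID values are invented for illustration.

```apex
// Post-refresh sketch: remap production external IDs to a known valid
// set of UAT values. Object, field, and ID values are illustrative.
Map<String, String> prodToUat = new Map<String, String>{
    'PROD-1001' => 'UAT-1001',
    'PROD-1002' => 'UAT-1002'
};
List<Integration_Account__c> records =
    [SELECT Id, External_Id__c FROM Integration_Account__c
     WHERE External_Id__c IN :prodToUat.keySet()];
for (Integration_Account__c rec : records) {
    rec.External_Id__c = prodToUat.get(rec.External_Id__c);
}
update records;
```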

Universal Containers (UC) is considering updating their Salesforce Release Management process. Which three best practices should UC consider for Release Management? Choose 3 answers


A. Design the right sandbox strategy for the release.


B. Release sign-off is only required for Production.


C. Regression testing is mandatory for each release.


D. Maintain a pre/post deployment checklist for each release.


E. Publish a release calendar for each phase of the release.





A.
  Design the right sandbox strategy for the release.

C.
  Regression testing is mandatory for each release.

D.
  Maintain a pre/post deployment checklist for each release.

Explanation:

Designing the right sandbox strategy for the release is a best practice, as it helps to ensure the quality and consistency of the code/configuration across different environments. Regression testing is mandatory for each release, as it helps to verify that the existing functionality is not broken by the new changes. Maintaining a pre/post deployment checklist for each release is a best practice, as it helps to track the tasks and dependencies for each deployment. Release sign-off is not only required for Production, but also for other environments such as UAT and Staging. Publishing a release calendar for each phase of the release is not a best practice, as it may change due to unforeseen circumstances and create confusion.

What are three advantages of the package development model? Choose 3 answers


A. Improving team development and collaboration.


B. Eliminating the need for change sets, which should no longer be used because they can get messy when working with the package development model.


C. Facilitating automated testing and continuous integration.


D. Significantly reducing the need for manually tracking changes.


E. Providing its own source control, so the source can be deployed in any sandbox org.





A.
  Improving team development and collaboration.

C.
  Facilitating automated testing and continuous integration.

D.
  Significantly reducing the need for manually tracking changes.

Explanation:

The advantages of the package development model are improving team development and collaboration, facilitating automated testing and continuous integration, and significantly reducing the need for manually tracking changes. The package development model lets developers work on modular, reusable components that can be easily tested and deployed. It does not eliminate the need for change sets, which can still be used to deploy non-packaged components or metadata. It also does not provide its own source control; it relies on external source control systems such as Git.
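In the package development model, the package contents are declared in the project's sfdx-project.json while the source of truth lives in an external version control system. A minimal sketch; the package name, version, and API version are illustrative:

```json
{
  "packageDirectories": [
    {
      "path": "force-app",
      "package": "SalesApp",
      "versionNumber": "1.0.0.NEXT",
      "default": true
    }
  ],
  "namespace": "",
  "sourceApiVersion": "58.0"
}
```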

