
Install Sitecore 9.2 from Scratch

Introduction

In this post I’ll provide instructions to install Sitecore 9.2 from scratch. This article is aimed at developers who may never have installed Sitecore before and need to work through all the prerequisites before installing Sitecore. It also takes the simplest approach possible to get a working installation of Sitecore 9.2. Sitecore 9.2 includes a Graphical User Interface (GUI) that turns the actual installation into a wizard-driven process that is much easier than earlier versions of Sitecore 9.x.

The starting point is a clean, updated installation of Windows 10 Professional. I have also installed Firefox and the Edge developer preview, the beta Chromium-based version of Edge.

Prerequisites

You are going to need the following programs in order to proceed, and instructions are provided below on how to access and install them.

  • IIS
  • Java
  • Solr
  • SQL Server
  • Visual Studio and Sitecore Rocks
  • Sitecore license or Sitecore Developer Trial license (60 days access)

IIS

The first thing to do is turn on IIS. Go to Control Panel -> Programs and Features and click on “Turn Windows features on or off”. Locate Internet Information Services in the Windows Features popup and click the checkbox. It will not show a tick, just a filled square, to indicate that not all features are enabled by default.

The default settings will be fine for our needs.

Open a browser and enter http://localhost; you should see the IIS Welcome splash screen.

Java

Java is required to run Solr, so if it is not installed you will need to download and install it. Download the latest version here: https://www.java.com/en/download/windows-64bit.jsp. Once the download has completed, run the .exe file to install Java.

Solr

Download Solr (8.2.0 at the time of writing) from Apache: https://www.apache.org/dyn/closer.lua/lucene/solr/8.2.0/solr-8.2.0.zip and extract the zip to the root of your C drive. You can check that Solr has ‘installed’ correctly and is working by starting it from an elevated command prompt (that is, one running as Administrator: right-click on Command Prompt and select Run as administrator). Navigate to the bin folder inside the Solr folder and run the command solr start -p 8984, as shown below.
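
For example, assuming Solr was extracted to C:\solr-8.2.0, from the elevated prompt run:

cd C:\solr-8.2.0\bin
solr start -p 8984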

You should now be able to access the Solr Admin page by entering http://localhost:8984 in your browser of choice; you should see the Solr Admin dashboard.

Assuming your installation of Solr is working correctly, stop Solr from the command prompt using solr stop -all.

If you are working on a VM now is a good time to take a snapshot.

To work with Sitecore 9.2, Solr must use https and must be running as a service. This involves creating a self-signed certificate and editing a configuration file to set Solr up to use https.

In an elevated PowerShell window run:

New-SelfSignedCertificate -CertStoreLocation cert:\LocalMachine\My -DnsName "localhost", "127.0.0.1" -FriendlyName "SolrCert" -NotAfter (Get-Date).AddYears(10)

Use the Cortana search bar to locate and open ‘Manage computer certificates’. Open the Personal -> Certificates folder and export the newly created certificate by right-clicking on it and selecting All Tasks -> Export. Select ‘Yes, export the private key’ and ‘Include all certificates in the certification path if possible’. Set a password; I am using the traditional ‘secret’. Select a location and name for the exported certificate, for example ‘C:\solr-8.2.0\solr_ssl_cert.pfx’. In the MMC, navigate to Trusted Root Certification Authorities -> Certificates and import the newly exported certificate.
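
If you prefer to script these steps instead of using the MMC, something like the following should work in an elevated PowerShell window (a sketch assuming the friendly name, password and path used above):

# Find the certificate created earlier by its friendly name
$cert = Get-ChildItem cert:\LocalMachine\My | Where-Object { $_.FriendlyName -eq "SolrCert" }
$password = ConvertTo-SecureString -String "secret" -Force -AsPlainText

# Export it (with the private key) to a .pfx file
Export-PfxCertificate -Cert $cert -FilePath "C:\solr-8.2.0\solr_ssl_cert.pfx" -Password $password

# Import the .pfx into Trusted Root Certification Authorities
Import-PfxCertificate -FilePath "C:\solr-8.2.0\solr_ssl_cert.pfx" -CertStoreLocation cert:\LocalMachine\Root -Password $password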

Edit ‘C:\solr-8.2.0\bin\solr.in.cmd‘ and uncomment and edit the HTTPS section so it looks like this (in my case, edit to suit your environment):
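
The original post showed the edited section as a screenshot; after uncommenting, the SSL settings in solr.in.cmd should look something like this (the paths, password and keystore type below match the certificate exported above; adjust to suit your environment):

set SOLR_SSL_KEY_STORE=C:\solr-8.2.0\solr_ssl_cert.pfx
set SOLR_SSL_KEY_STORE_PASSWORD=secret
set SOLR_SSL_KEY_STORE_TYPE=PKCS12
set SOLR_SSL_TRUST_STORE=C:\solr-8.2.0\solr_ssl_cert.pfx
set SOLR_SSL_TRUST_STORE_PASSWORD=secret
set SOLR_SSL_TRUST_STORE_TYPE=PKCS12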

Use a command prompt to start Solr using the command solr start -f -p 8984. You should now be able to connect to the Solr Admin page at https://localhost:8984. Note that Firefox may give a warning because it is a self-signed certificate; you can disregard it or add an exception. Stop Solr by pressing Ctrl+C in the command prompt window.

Finally, we need to set Solr to run as a service. Download NSSM (a great little free tool for creating and managing Windows services) from https://nssm.cc/download and extract the contents of the .zip file to an appropriate location; I used C:\, so my path to nssm is C:\nssm-2.24\win64\nssm.exe.

In an elevated command prompt run C:\nssm-2.24\win64\nssm install solr82; in the popup, enter the path to solr.cmd and the arguments -f -p 8984, then click Install service. The -f argument makes Solr run in the foreground so nssm has access to stop and start it; -p sets the port.
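
If you prefer to avoid the GUI popup, nssm can also be driven entirely from the command line; a sketch assuming the paths above:

C:\nssm-2.24\win64\nssm.exe install solr82 "C:\solr-8.2.0\bin\solr.cmd" "-f -p 8984"
C:\nssm-2.24\win64\nssm.exe start solr82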

As a final check that everything is working as expected, restart the PC (the Solr service should start automatically) and connect to https://localhost:8984.

SQL Server

Go to https://www.microsoft.com/en-au/sql-server/sql-server-downloads to download SQL Server; I am using 2017 Express. Download and launch the installer; I left it on the standard options. Install SQL Server Management Studio (SSMS), downloading it from https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms. You should now be able to launch SSMS and connect to the SQL Server.

There is a little bit of configuration required before it will work with Sitecore, though. The default installation uses Windows Authentication, and we need SQL Authentication enabled and the ‘sa’ account activated. Log in to SQL Server using SSMS and right-click on the server to access the Server Properties.

On the Security tab ensure that ‘SQL Server and Windows Authentication mode’ is selected. Next, go to Security -> Logins and open the Properties dialog for ‘sa’. Set a password and ensure that Login is enabled on the Status tab. It will be necessary to restart the SQL Server service before the changes take effect; this can be done easily using the Services app.

Finally, we need to allow Contained Database Authentication.  Click New Query and run:

sp_configure 'contained database authentication', 1;

reconfigure;

Visual Studio & Rocks

Download and install Visual Studio; I am using the 2019 Community edition, obtainable from https://visualstudio.microsoft.com/vs/

Install the Sitecore Rocks extension. Use the Extensions -> Manage Extensions tool to add Sitecore Rocks, then restart Visual Studio. You should now be able to locate Rocks under Extensions.

Sitecore 9.2

Sitecore 9.2 ships with a Graphical User Interface (GUI) that makes the installation process simple if you do not need to customise the installation. We will be using the GUI for our installation.

You will need permission to access the Sitecore Developer site (https://dev.sitecore.net) and download assets. You will also need a valid Sitecore license to be able to install and use Sitecore. If this is an issue for you consider joining the excellent Sitecore Developer Trial Program. This free trial will allow you to use Sitecore for 60 days.

If you have not already, download Sitecore 9.2.0 rev. 002893 (Setup XP0 Developer Workstation rev. r150).zip from dev.sitecore.net and extract the zip into a folder (I would suggest C:\Installer), referred to below as the installation folder. Copy your license into this folder.

Because we are going to use the GUI to install Sitecore, most of the needed values are entered during the installation; however, if we want to install Habitat in our new instance of Sitecore there is one small edit required. Habitat is a demo site that follows the Sitecore best practices described in Helix (https://helix.sitecore.net/).

In the installation folder open setup.exe.config in a text editor. Alter the line <parameter name="SitecoreSiteName" value="{Prefix}sc.dev.local" /> so that it reads <parameter name="SitecoreSiteName" value="{Prefix}.dev.local" />; this will make our site name match the name expected by Habitat.

In the installation folder right-click on setup.exe and select Run as Administrator. Do not skip the Prerequisites stage; it is required to install SIF (if needed) and several Windows and .NET components that are not part of the default Windows installation. If you run the Sitecore GUI installer again you can skip this step. Now it should be just a case of entering the values requested by the wizard and allowing it to install Sitecore.

In my environment the Summary screen looked like:

That’s it! You have now installed Sitecore 9.2 and you are ready to start exploring this great tool.


SXA Speedy – Supercharge your SXA Page Speed Scores in Google

We are excited to preview our latest Open Source module. Before jumping into the actual technical details here are some of the early results we are seeing against the Habitat SXA Demo.


Results:

[Before and after Mobile Lighthouse score screenshots]
* Results based on a Mobile Lighthouse audit in Chrome.
* Results are based on a local developer machine. Production results usually incur an additional penalty due to network latency.

Want to know more about our latest open source SXA Sitecore module? Read on.


I’m continually surprised by the number of new site launches that fail to implement Google’s recommendations for Page Speed. If you believe what Neil Patel has to say, this score is vitally important to SEO and your search ranking. At Aceik it’s one of the key benchmarks we use to measure the projects we launch and the projects we inherit and have to fix.

The main issue is often a fairly low mobile score; desktop tends to be easier to cater for. In particular, pick out any SXA project that you know has launched recently: even with bundling properly turned on, it’s unlikely to get over 70/100 (mobile score). The majority we tried came in somewhere around the 50 to 60 out of 100 mark.

Getting that page score into the desired zone (which I would suggest is 90+) is not easy, but here is a reasonable checklist to get close.

1) Introduce image lazy loading.
2) Ensure a cache strategy is in place and verify it’s working.
3) Dianoga is your friend for image compression.
4) Use responsive images (you must serve up smaller image sizes for mobile).
5) Introduce Critical CSS and deferred CSS files.
6) JavaScript is not a page speed friend. Defer, defer, defer.

The last two items are, I believe, the hardest to get right. These are the focus of our new module.


Check out the GitHub repository.

I have also done an installation and usage video.

So how will the module help you get Critical CSS and JS deferral right?

Deferred Javascript Load

For JavaScript, it uses a deferred loading technique mentioned here. I attempted a few different techniques before finding this blog, and the script the author provides (closer to the bottom of the article) seems to get the best results. It essentially incorporates some clever tactics (as mentioned in the article) that defer script load without compromising load order.

I also added one more technique that I have found useful: using a cookie to detect a first or second-time visitor. Second-time visitors will naturally have all external resources cached locally, so we can provide a completely different loading experience on the second pass. It stands to reason that we only need to provide a deferred experience on the very first page load.
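
As a sketch of the idea (the cookie name and the two loader functions are illustrative, not the module’s actual code):

// Returning visitors have warm caches, so they can skip the deferred pathway.
function isReturnVisitor() {
    return document.cookie.split('; ').some(function (c) {
        return c.indexOf('returnVisitor=') === 0;
    });
}

if (isReturnVisitor()) {
    loadScriptsNow(scriptsToLoad);      // hypothetical helper: load assets normally
} else {
    document.cookie = 'returnVisitor=1; max-age=31536000; path=/';
    loadScriptsDeferred(scriptsToLoad); // hypothetical helper: deferred first-visit load
}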

Critical + Deferred CSS Load

For CSS we incorporated the critical viewport technique that Google has recommended for some time. This technique was mentioned in this previous blog post. Generating the Critical CSS is not something we want to be doing manually, and there is an excellent gulp-based package that does this for you.

It can require some intervention and tweaking of the Critical CSS once generated, but the Gulp scripts provided in the module do seek to address/automate this.

Our module adds a button into the Configure panel inside the Sitecore CMS, so content editors can trigger regeneration of the Critical CSS whenever needed.

Generate Critical button added to Configure.

Local vs Production Scores

It’s also important to remember that the scores you achieve via Lighthouse (built into Chrome) on localhost and your non-public development servers can be vastly different from production. In fact, it’s probably safest to assume that non-production boxes give false positives in the region of 10 to 20 points, so it’s best to assume that your score on production will be a little worse than expected.

Conclusion

It’s a fair statement that you can’t just install the module and expect page load to be perfect in under 10 minutes. Achieving top page load speeds requires many technical things to work together. By ensuring that the previously mentioned checklist items are done (adequate servers, Sitecore cache, image loading techniques) you are partway over the line. By introducing the deferred load techniques in the module (as recommended by Google) you should then be a step closer to a top score.

For more hints please see the Wiki on Github.

This module has been submitted to the Sitecore Marketplace and is awaiting approval.


Author: Thomas Tyack – Solutions Architect / Sitecore MVP 2019


Part 2: Experience Profile – Multi-site, Multi-domain tracking

This blog examines a way in which you can effectively track a user across multiple sites with different domains. It demonstrates how visitor data collected across different top-level domains can be merged together. Merged visitor data provides a much clearer picture of a user’s movements (experience profile timeline) when a brand consists of many different websites.

You can view Part 1, Part 3 and  Part 4 via the respective links.


Note: Tracking across sub-domains is a much easier task in Sitecore and is solved using the setting “Analytics.CookieDomain”. The problem this blog solves is that visits across different top-level domains cannot be identified and merged, because top-level domains cannot access each other’s analytics cookie (for very valid security reasons). The problem is summarised well here.


Why is this necessary?

To start off with let’s define the current problem that required this solution.

Problem Definition:

Given that two websites are hosted in Sitecore under unique top-level domains, if a visitor goes to http://www.abc.com.au and then visits http://www.def.com.au, the same visitor is not linked and the Experience Profile shows two different visitors.

Solution:

The solution Aceik came up with for this is best depicted with a simple diagram.


(download a bigger diagram)

The different technologies at play in the diagram include:

  1. Multiple Sitecore Sites
  2. Google Analytics or GTM
  3. IFrames – running javascript, using PostMessage, with domain security measures.
  4. Cookies – Top Level Domain

What is happening in Simple Terms?

Essentially a user visits any one of your Sitecore sites and we tell XDB how it can identify that user by using the assigned Google Client ID.

To do this we use a combination of IFrame message passing between a group of trusted sites and cookies that give us the ability to go back and get the original Client ID for reuse.

By reusing the same Google Client ID on all Sitecore sites we can produce a more complete user profile. One that will contain visit data from each website. The resulting timeline of events in the Experience Profile will now display a federated view across many domains.


A More Technical Explanation

This is a more detailed Technical explanation. Let’s break it down into the sequence of steps a user could trigger and the messages being passed around.

First visit to “def.com.au”:

  1. User navigates to “def.com.au”.
  2. Google analytics initializes on page load with a random Client ID (CID) assigned.
  3. Async JavaScript is used to inject an invisible IFrame into the page. The IFrame URL is pointed at “abc.com.au/cookie”.
  4. Once the IFrame has finished loading, JavaScript obtains the CID from Google and uses “PostMessage” to pass it through to the “Global CID Cookie” page (a sketch of this exchange follows this list).
  5. The “Global CID Cookie” has no prior value set so is updated with the CID passed in.
  6. The cookie page responds using “PostMessage” and sends back the same CID passed in.
  7. JavaScript on the page “def.com.au” receives the message and stores the CID received in the “Domain CID Cookie”.
  8. JavaScript on the page triggers backend code that will identify the user against the CID and merge all visits for that user.
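
As an illustrative sketch of steps 3 to 7 (the ga() ready-callback API is standard analytics.js; the cookie name matches the GTM snippet later in this post, but the module’s actual code differs):

// Inject a hidden IFrame pointed at the trusted cookie page.
var frame = document.createElement('iframe');
frame.style.display = 'none';
frame.src = 'https://abc.com.au/cookie';
frame.onload = function () {
    // Ask Google Analytics for this site's Client ID and post it to the cookie page.
    ga(function (tracker) {
        frame.contentWindow.postMessage({ cid: tracker.get('clientId') }, 'https://abc.com.au');
    });
};
document.body.appendChild(frame);

// Receive the canonical CID back and store it in the local "Domain CID Cookie".
window.addEventListener('message', function (e) {
    if (e.origin !== 'https://abc.com.au') return; // only trust the cookie host
    document.cookie = 'LocalCidCookie=' + e.data.cid + '; path=/; max-age=63072000';
});

The cookie page at “abc.com.au/cookie” performs the mirror image: it reads or sets the “Global CID Cookie” and posts the canonical CID back to the embedding page.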

Later visit to “hij.com.au”

  1. User navigates to “hij.com.au”.
  2. Google analytics initialises on page load with a random Client ID (CID) assigned.
  3. Async JavaScript is used to inject an invisible IFrame into the page. The IFrame URL is pointed at “abc.com.au/cookie”.
  4. Once the IFrame has completed loading the URL JavaScript obtains the CID from Google and uses “PostMessage” to pass it through to the “Global CID Cookie”.
  5. The “Global CID Cookie” has a prior value, so the CID passed in is not used or set.
  6. The cookie page responds using “PostMessage” and sends back the prior existing CID stored from the first visit to”def.com.au”.
  7. JavaScript on the page “hij.com.au” receives the message and stores the CID received in the “Domain CID Cookie”.
  8. JavaScript on the page triggers backend code that will identify the user against the CID.  The current visit data for  “hij.com.au” is merged with the previous visit data from “def.com.au”.
  9. JavaScript on the page updates the CID stored in Google Analytics so that no other actions use the newer CID generated by “hij.com.au”.
  10. When the page is refreshed the Google Analytics initialisation code checks the “Domain CID Cookie” and passes through the existing CID to google for continued use.

Last visit to “abc.com.au”

  1. Google analytics initialises on page load, checks for the existence of the “Global CID Cookie” and passes the existing CID through to Google for continued use.
  2. Javascript on the page notices that “Global CID Cookie” is set but the “Domain CID Cookie” is not.
  3. JavaScript on the page triggers backend code that will identify the visit against the CID.  The current visit data for  “abc.com.au” is merged with the previous visit data from “def.com.au” and “hij.com.au”.

Second-time visits are an easy win

Once each domain has completed a first pass of checking for a Global CID, the code is smart enough not to repeat all these steps again.

The first pass sets a local domain cookie and this is used for quick access on each subsequent page load. The first pass is also coded in an async way so that it will not have an impact on page load times.

We can also set it up so that the code initialising the Google tracker is provided with the multi-domain Client ID straight away.

https://www.useloom.com/share/0b8c9f0a2fbc41c0baa1ae4db5e9ae6b

This Loom video shows exactly how we set up the global page view event with two variables. The final variable runs custom JavaScript that reads our cookie to grab the Client ID.

The JavaScript to paste into the GTM variable is:

function() {
  // Read a named cookie value (returns undefined if the cookie is not present).
  var getGAPrior = function(name) {
    var value = "; " + document.cookie;
    var parts = value.split("; " + name + "=");
    if (parts.length == 2) return parts.pop().split(";").shift();
  };

  var localCookie = getGAPrior("LocalCidCookie");
  if (typeof localCookie !== "undefined") { return localCookie; }

  var globalCookie = getGAPrior("GlobalCidCookie");
  if (typeof globalCookie !== "undefined") { return globalCookie; }

  return null;
}

What about FXM ?

You could potentially solve this same problem using a customisation around FXM.  It may be possible for FXM to replace the need for IFrame communication.

In order to achieve this, you would need to write a customisation to the Beacon service that allowed a Google Client ID to be sent and received.

However, the main sticking point remains the need to maintain a global storage area (Global Cookie) that is owned by the user. Due to the browser limitations (noted by Sitecore) around passing cookies, I’m not entirely sure an FXM replacement will work.

Compare that with browser support for IFrame “PostMessage” and you can see why we travelled down one rabbit hole rather than the other.

Reference: https://community.sitecore.net/developers/f/10/t/380


Conclusion

Every website visitor gets assigned a Google Analytics Client ID. You can use this ID to identify a user in XDB very early on. For multi-site tracking, the Client ID supplied by Google comes in very handy. By using some clever communication tactics between your sites you can merge all the user’s visit data into a single profile view. The resulting timeline of events in the Experience Profile should prove very useful to your marketing teams.

To continue reading, jump over to Part 3 where we cover off External Tracking via FXM and Google Client ID.


Aceik – Two MVPs in 2019

This year team Aceik have set ourselves new goals, new challenges and new benchmarks in the Sitecore space. Just one month into 2019 and we’re absolutely thrilled to find these goals already being achieved.

Our New Year’s resolutions are off to a great start with our team already setting the benchmark in global Sitecore excellence. Two of our incredible staff members, Jason Horne and Thomas Tyack, have been awarded the prestigious Sitecore Most Valuable Professional (MVP) award for 2019.

MVPs are recognized by Sitecore as being global leaders in Sitecore implementation and for proactively utilising their knowledge within the wider community. Both Jason Horne and Thomas Tyack go above and beyond in their spare time to share their knowledge with their wider networks and to strengthen their own expertise.

So, what makes Thomas and Jason among the global best? Jason Horne is a fantastic leader, four-time MVP, and founder and CEO of Aceik. He also gives back to the Sitecore community by running the only Sitecore meetup group in Melbourne. Thomas Tyack also wears many hats at Aceik: he is the Queensland Technical Lead, Architect and Senior Developer for our team, and he carries this workload with expert professionalism and advanced skill.

Our passion as a team is reflected in the work we do and we ensure every client is fully across every project, start to finish. Transparency is key in our business and we work with our clients to ensure every project goes above and beyond the benchmark. Over the past five years, we’ve ensured a 100% success rate in project delivery to the highest standard. Our efficient yet diligent work is what makes us stand out from the rest.

We are absolutely thrilled to have two globally-recognised Sitecore MVPs as part of our team, a true accolade to the work that goes on behind the scenes. A huge congratulations to other 2019 MVPs around the world, we can’t wait to see what this year brings!



Geocoding Australian postcodes

While working on some code that allowed a user to perform a search using only a postcode, I discovered some strange behaviour with the Google Maps API.

According to the Google Maps API documentation (https://developers.google.com/maps/documentation/geocoding/intro#ComponentFiltering), component filtering allows you to filter by postal_code and country, which would suit this need perfectly. I gave this a try, and upon initial testing it seemed that this was the solution. However, after further testing I found that for some postcodes (specifically, some in NSW and ACT), the geocoding query would return ZERO_RESULTS. Maybe this is because there is an overlap in the 2000-series postcodes ¯\_(ツ)_/¯.

An example of the URL I was using for this is shown below (note that postcode 2022 and country AU will return ZERO_RESULTS):

http://maps.googleapis.com/maps/api/geocode/json?components=country%3aAU%7Cpostal_code:2022&sensor=false&client=your_client&channel=web&language=en&signature=your_signature

There are many examples on Stack Overflow of people using this format to search by postcode and claiming it to be the solution, but most of them are either from other countries, where this probably isn’t an issue, or they must not have discovered this problem.

According to the Google Maps API documentation, you can use administrative_area (among other fields) to “influence” results, so I tried adding the state to this field, and I found that this made everything work properly. That means that the following URL will geocode the postcode 2022:

http://maps.googleapis.com/maps/api/geocode/json?&components=country%3aAU%7Cpostal_code%3a2022%7Cadministrative_area%3aNSW&sensor=false&client=your_client&channel=web&language=en&signature=your_signature

The issue I had then was that if the user searches using only the postcode, I had to find a way to provide the state for that postcode to Google so it could geocode the postcode properly. To do this, I created a function that returns the state for a given postcode, as shown below (although this is not a perfect solution, because new postcodes are added from time to time; potentially a call to an Australia Post API or similar may work better going forward):


// Requires: using System.Collections.Generic; using System.Linq;
// (the generic type arguments below were stripped from the original post by the blog engine)
public static string PostcodeToState(int postcode)
{
    var postcodes = new List<KeyValuePair<string, int>>();
    postcodes.AddRange(Enumerable.Range(1000, 1000).Select(x => new KeyValuePair<string, int>("NSW", x)));
    postcodes.AddRange(Enumerable.Range(2000, 600).Select(x => new KeyValuePair<string, int>("NSW", x)));
    postcodes.AddRange(Enumerable.Range(2619, 280).Select(x => new KeyValuePair<string, int>("NSW", x)));
    postcodes.AddRange(Enumerable.Range(2921, 79).Select(x => new KeyValuePair<string, int>("NSW", x)));

    postcodes.AddRange(Enumerable.Range(200, 100).Select(x => new KeyValuePair<string, int>("ACT", x)));
    postcodes.AddRange(Enumerable.Range(2600, 19).Select(x => new KeyValuePair<string, int>("ACT", x)));
    postcodes.AddRange(Enumerable.Range(2900, 21).Select(x => new KeyValuePair<string, int>("ACT", x)));

    postcodes.AddRange(Enumerable.Range(3000, 1000).Select(x => new KeyValuePair<string, int>("VIC", x)));
    postcodes.AddRange(Enumerable.Range(8000, 1000).Select(x => new KeyValuePair<string, int>("VIC", x)));

    postcodes.AddRange(Enumerable.Range(4000, 1000).Select(x => new KeyValuePair<string, int>("QLD", x)));
    postcodes.AddRange(Enumerable.Range(9000, 1000).Select(x => new KeyValuePair<string, int>("QLD", x)));

    postcodes.AddRange(Enumerable.Range(5000, 1000).Select(x => new KeyValuePair<string, int>("SA", x)));

    postcodes.AddRange(Enumerable.Range(6000, 798).Select(x => new KeyValuePair<string, int>("WA", x)));
    postcodes.AddRange(Enumerable.Range(6800, 200).Select(x => new KeyValuePair<string, int>("WA", x)));

    postcodes.AddRange(Enumerable.Range(7000, 1000).Select(x => new KeyValuePair<string, int>("TAS", x)));

    postcodes.AddRange(Enumerable.Range(800, 200).Select(x => new KeyValuePair<string, int>("NT", x)));

    postcodes.Add(new KeyValuePair<string, int>("ACT", 2620));
    postcodes.Add(new KeyValuePair<string, int>("NSW", 3644));
    postcodes.Add(new KeyValuePair<string, int>("NSW", 3707));

    return postcodes.Where(x => x.Value == postcode).Select(x => x.Key).FirstOrDefault();
}
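
A hypothetical usage, combining the lookup with the component filtering described above (other URL parameters omitted for brevity):

// Look up the state for a bare postcode and pass it as administrative_area.
var state = PostcodeToState(2022); // "NSW"
var url = "http://maps.googleapis.com/maps/api/geocode/json" +
          "?components=country:AU|postal_code:2022|administrative_area:" + state;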

 


Congratulations! You’re Sitecore Certified

Our developers are Sitecore 9 certified

We’d like to congratulate all our developers who’ve achieved Sitecore® 9.0 Certified Platform Associate Developer certification so far.

Sitecore’s certification exams validate the skills and knowledge of developers, marketers and business users. Test takers who pass the certification exams earn the distinction of being Sitecore Certified Professionals.

At Aceik we work exclusively in Sitecore and have formed a strategic relationship with Sitecore. All our developers are Sitecore certified, work towards certification in the latest Sitecore version, and are highly qualified and totally dedicated to their projects.

Please contact us for more details or if you want to discuss a project. All enquiries should be made to info@aceik.com.au or +61 (0)4 2697 1867.

 

 


Sitecore Page Speed: Part 3: Eliminate JS Loading Time

In part 1 & part 2 of our Sitecore page speed blog, we covered off:

  • The Google Page Speed Insights tool.
  • We looked at a node tool called critical that can generate minified above-the-fold (critical viewport) CSS.
  • We referenced the way in which Google recommends deferring CSS loading.
  • We showed a way to integrate “Above The Fold” CSS into a Helix-based project and achieve a page free of render-blocking CSS.

In this 3rd part of the series, we will introduce a way to defer the load of all external javascript assets (async).

A reminder that I have committed the sample code for this blog into a fork of the helix habitat example project. You can find the sample here. For a direct comparison of changes made to achieve these page load enhancements, view a side by side comparison here.

Dynamic JS loading Installation Steps:

  1. Inside Sitecore, add a new view rendering that references the file /Views/Common/Assets/Scripts-3.2.1.cshtml.
    • Note down the ID of this rendering and use it in place of the rendering ID in the next step.
  2. Update the Default.cshtml layout to include the new cached rendering:

 @*Scripts Legacy Jquery jquery-3.2.1 *@
 @Html.Sitecore().CachedRendering("{B0DD36CE-EE4A-4D01-9986-7BEF114196DD}", new RenderingCachingSettings { Cacheable = true, CacheKey = cacheKey + "_bottom_scripts" })

    • cacheKey = a variable holding something unique that identifies the page. You could use the Sitecore context item ID or path, for example.

Explanation:

The rendering Scripts-3.2.1.cshtml will render out the following javascript onto the page:

<script>
var scriptsToLoad = ['//cdnjs.cloudflare.com/ajax/libs/modernizr/2.8.3/modernizr.min.js','//maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js','/assets/js/slick.min.js','/assets/js/global.js','/assets/js/Script.js','/assets-legacy/js/lib/lazyload.min.js'];
</script>
<script src="/assets-legacy/js/lib/jquery-3.2.1.min.js" async defer></script>
  • First of all, it prints out a JS array of all the scripts that this page requires.
    • This is the array of JS files that comes from Themes and Page Assets inside the CMS. If you are familiar with Habitat Helix this list can be content managed inside the CMS.
  • It then instructs the jquery library to be loaded async (which will not block the network download of the page response).
  • Once jquery is loaded, this modified version of jquery contains some code at the end that will read in the list of scripts dynamically and apply them to the page.
    • This is achieved with fairly simple AJAX load calls to the script URLs (a sketch follows this list).
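
As an illustration only (the actual code lives in the fork linked above, and loadNext is a hypothetical helper name), sequential loading via jQuery might look like this:

// Load each script in order; each AJAX call fetches and executes one script.
function loadNext(scripts, i) {
    if (i >= scripts.length) return;
    jQuery.ajax({ url: scripts[i], dataType: "script", cache: true })
        .always(function () { loadNext(scripts, i + 1); });
}
loadNext(scriptsToLoad, 0);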

Outcome:

Once integrated successfully you will end up with a page that does not contain any blocking JS network calls.  The Google Page Speed tool should give you a nice score boost for your achievement in reducing initial load time.


Hints and Tips:

Bootstrapping JQuery Code:

  • jQuery Document.Ready() function calls may not fire inside dynamically loaded JS files. This is because the JS file is loaded after the DOM is ready, and by that stage it is too late for the Document.Ready() event.
  • As a workaround, you could code your JS files to bootstrap on either Document.Ready() or whenever $ is not undefined.
  • In the case of dynamic loading in this manner, because jQuery was loaded first, $ should not be undefined and your code should bootstrap successfully.

Debugging in chrome:

  • When dynamically loading JS files, they may (strangely) not appear in the Chrome console debugger as you would normally expect.
  • The workaround is to add a comment to the top of each JS library:
  • //# sourceURL=global.js
  • This will cause the Chrome debugger to list the file in the Sources tab under the “(no domain)” heading.
  • You will then be able to debug the file as normal.

Aceik announces FuseIT (S4S) partnership

Aceik works directly with FuseIT, principally by helping customers link their Sitecore visits to their Salesforce contacts through FuseIT’s industry-standard S4S (Sitecore – Salesforce) connector. We handle the implementation so that the client’s Sitecore analytics are surfaced in Salesforce, allowing the sales team to track the behavior of individual website visitors and personalize their experience.

 

About FuseIT

FuseIT is a private company founded in New Zealand in 1992. FuseIT has a history of building enterprise content management systems but our real forte is complex systems integration. Our qualified and highly capable team of professionals have proven experience in consulting, business analysis, solution architecture, software development, integration and innovation.

 

About Aceik

We’re a tight-knit team of Melbourne developers who are true craftspeople, creating cutting-edge solutions with the latest Sitecore technology, solving business problems while forging long-term client connections.


Aceik announces Stackla partnership

Aceik enjoys a partnership with Stackla, the leading user-generated content (UGC) platform for enterprise brands. We facilitate Australian integrations of Stackla into the Sitecore Experience Platform so that our customers can deliver highly personalized brand experiences with fresh, real-time content.

 

About Stackla

Stackla is the world’s smartest visual content engine, helping modern marketers discover, manage and display the best content across all their marketing touchpoints.

With our AI-powered user-generated content (UGC) platform and asset manager, Stackla sits at the core of the marketing stack, actively discovering and recommending the best visual content from across the social web, as well as internal and external resources, to fuel personalized content experiences at scale.

Using Stackla, marketers can spend less time creating expensive creative assets and more time delivering relatable and influential visual customer experiences.

 

About Aceik

We’re a tight-knit team of Melbourne developers who are true craftspeople, creating cutting-edge solutions with the latest Sitecore technology, solving business problems while forging long-term client connections.


Sitecore Azure Search: Top 10 Tips

It’s been a while since I first wrote about Azure Search, and we have a few more tips and tricks on how to optimise Azure Search implementations.

Before proceeding if you missed our previous posts check out some tools we created for Azure Search Helix setup and Geo-Spatial Searching.

Also, check out the slides from our presentation at last year’s Melbourne Sitecore User Group.

OK, let’s jump into the top 10 tips:

Tip 1) Create custom indexes for targeted searching

The default out-of-the-box indexes will attempt to cover just about everything in your Sitecore databases; they do so to support Sitecore CMS UI searches out of the box. It’s not a problem if you want to use the default indexes (web, master) to search with; however, for optimal searches and faster re-indexing times, a custom index will help performance.

By stepping back and looking at the different search requirements across the site you can map out your custom indexes and the data that each will require.

Consider also that if the custom indexes need to be used across multiple Feature Helix modules, the configuration files and search repositories may need to live in an appropriate Foundation module. More about Feature vs Foundation can be found here.

Tip 2) Keep your indexes lean

This tip follows on from the first Tip.

Essentially the default Azure Search configuration out of the box will have:

<indexAllFields>true</indexAllFields>

This can include a lot of fields, and you’re probably not going to need every single Sitecore field in order to present the user with meaningful data on the front-end interfaces.

The other option is to specify only the fields that you need in your indexes:

<include hint="list:IncludeField"> 
<Text>{A60ACD61-A6DB-4182-8329-C957982CEC74}</Text> 
</include>

The end result will limit the amount of JSON payload that needs to be sent across the network and also the amount of payload that the Sitecore Azure Search Provider needs to process.

Particularly if you are returning thousands of search results you can see what happens when “IndexAllFields” is on via Fiddler.

This screenshot is via a local development machine and Azure Search instance at the Microsoft hosting centre.


  • So for a single query, “IndexAllFields” can result in:
    • a 2 MB plus JSON payload size;
    • document results with all Sitecore metadata included (that could be around 100 fields).

If your query results in document counts in the thousands, obviously the payload will grow rapidly. By reducing the fields in your indexes (removing unnecessary data) you can speed up query, transfer and processing times and get the data displayed quicker.

Tip 3) Make use of direct azure connections

Sitecore has done a lot of the heavy lifting for you in the Sitecore Azure Search provider. It’s a bit like a wrapper that does all the hard work for you. In some cases, however, you may find that writing your own queries that connect via the Azure Search DLL gives you better performance.
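
As a rough sketch of what a direct connection can look like (this uses the Microsoft.Azure.Search SDK rather than Sitecore’s provider; the service name, index name, API key and field names are placeholders):

// A sketch only: querying Azure Search directly, bypassing the Sitecore provider.
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

var client = new SearchIndexClient("my-search-service", "my-custom-index",
    new SearchCredentials("query-api-key"));

// Ask for only the fields the UI needs and cap the result count.
DocumentSearchResult<Document> results = client.Documents.Search(
    "sitecore",
    new SearchParameters { Top = 20, Select = new[] { "title_t", "url_s" } });

foreach (SearchResult<Document> result in results.Results)
{
    var title = result.Document["title_t"];
}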

Tip 4) Monitor performance via Azure Search Portal

It’s really important to monitor your Azure Search Instance via Azure Portal. This will give you critical clues as to whether your scaling settings are appropriate.

In particular, look out for high latency times, as these indicate that your search queries are getting throttled. As a result, you may need to scale up your Azure Search instance.

In order to monitor your latency times go to:

  1. Login to Azure Portal
  2. Navigate to your Azure Search Instance.
  3. Click on Metrics in the left-hand navigation.
  4. Select the “Search Latency” checkbox and scan over the last week.
  5. You will see some peaks; these usually indicate heavy periods of re-indexing, during which the Azure Search instance is under heavy load. As long as your peaks stay under the 0.5-second mark, you’re OK. If you see search latency up in the 2-second range, you probably need to either adjust how your indexes are used (caching and re-indexing) or scale up to avoid the flow-on effects of slow search.

Tip 5) Cache Wrappers

In the code that uses Azure Search, it is advisable to use cache wrappers around the searches when possible. For your most common searches, this should prevent Azure Search from getting hit repeatedly with the same query.

For a full example of a cache wrapper, check out the section titled Sitecore.Caching.CustomCache in my previous blog post.
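
As a minimal stand-in for the full Sitecore.Caching.CustomCache version linked above, a simple HttpRuntime cache wrapper could look like this (names and TTL are illustrative):

using System;
using System.Web;
using System.Web.Caching;

public static class SearchCache
{
    // Return a cached result if present; otherwise run the search and cache it.
    public static T GetOrAdd<T>(string cacheKey, Func<T> search, TimeSpan ttl) where T : class
    {
        var cached = HttpRuntime.Cache.Get(cacheKey) as T;
        if (cached != null)
            return cached;

        var result = search();
        if (result != null)
            HttpRuntime.Cache.Insert(cacheKey, result, null,
                DateTime.UtcNow.Add(ttl), Cache.NoSlidingExpiration);
        return result;
    }
}

// Hypothetical usage around a common query:
// var results = SearchCache.GetOrAdd("nav-products", () => RunProductSearch(), TimeSpan.FromMinutes(10));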

Tip 6) Disable Indexing on CD

This is a hot tip that we got from Sitecore Support when we started to encounter high search latency during re-indexing.

Most likely in your production setup, you will have a single Azure Search instance shared between CM and CD environments.

You need to factor in that CM should be the server that controls the re-indexing (writing) and CD will most likely be the server doing the queries (reading).

Re-indexing is triggered via the event queue, and every server subscribes and reacts to these events. With the out-of-the-box search configuration, each server will cause the Azure Search indexes to be updated. In a shared Azure Search (or Solr) instance the indexes only need to be updated by a single server; each additional re-index is overkill, simply doubling up on re-indexing workload.

You can, therefore, adjust the configuration on the CD servers so that it does not cause re-indexing to happen.

The trick is in your index configuration files to use Configuration Roles to specify the indexing strategy on each server.

 <strategies hint="list:AddStrategy">
 <!--
 NOTE: the order of these controls the execution order
 -->
 <strategy role:require="Standalone OR ContentManagement" ref="contentSearch/indexConfigurations/indexUpdateStrategies/onPublishEndAsync"/>
 <strategy role:require="ContentDelivery" ref="contentSearch/indexConfigurations/indexUpdateStrategies/manual"/>
 </strategies>

Setting the index update strategy to manual on your CD servers will take a big load off your remote indexes.

Particularly if you have multiple CD servers using the same indexes. Each additional CD server would cause additional updates to the index without the above setting.

Tip 7) Rigid Indexes – Have a deployment plan

If your deployment includes additions or changes to the indexes, and you need 100% availability of search data, a deployment plan for re-indexing will be required.

Grant chatted about the problem in his post here. To get around this you could consider using the blue/green paradigm during deployments.

  • This would mean having a set of blue indexes and a set of green indexes.
  • Using slot swaps for your deployments.
    • One slot points to green in configuration.
    • One slot (production) points to blue in configuration.
  • To save on costs you could decommission the staging slot between deployments.

Tip 8) HttpClient should be a singleton or static

The basic idea here is that you should keep the number of HttpClient instances in your code to an absolute minimum if you want optimal performance.

The Sitecore Azure Search provider actually spins up two HttpClient connections for every single index. This in itself is not ideal, and unfortunately there is not a lot you can do about this code in the core product itself.

In your own connections to other APIs, however, HttpClient SendAsync is perfectly thread safe.

By using HttpClient singletons you stand to gain big in the performance stakes. One great blog article worth reading runs you through the performance benefits. 

It’s also worth noting that in the Azure Search documentation Microsoft themselves say you should treat HttpClient as a singleton.
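
A minimal sketch of the singleton pattern for your own outbound API calls (the class and timeout are illustrative):

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ApiClient
{
    // One shared instance for the lifetime of the app; avoids socket exhaustion.
    public static readonly HttpClient Instance = new HttpClient
    {
        Timeout = TimeSpan.FromSeconds(30)
    };
}

class Example
{
    // GetAsync/SendAsync are thread safe on a shared instance.
    static async Task<string> FetchAsync(string url)
    {
        var response = await ApiClient.Instance.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}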

Tip 9) Monitor your resources

In Azure web apps you have finite resources within your App Service plan. Opening multiple connections with HttpClient and not disposing of them properly can have severe consequences.

For instance, we found a bug in the core Sitecore product that was caused by the connection retryer: it held ports open forever whenever we hit our Azure Search plan usage limits. The result was that we hit the outbound open-socket connection limit, and this caused our Sitecore instance to grind to a halt.

Sitecore has since resolved the issue mentioned above after a lengthy investigation working alongside the Aceik team. This was tracked under reference number 203909.

To monitor the number of sockets in Azure we found a nice page on the MSDN site.

Tip 10) Make use of OData Expressions

This tip relates strongly to tip 3. Azure Search has some really powerful OData expressions that you can make use of via a direct connection. Once you have had a play with direct connections, it is surprisingly easy to spin up really fast queries.

Operators include:

  • OrderBy, Filter (by field), Search
  • Logical operators (and, or, not).
  • Comparison expressions (eq, ne, gt, lt, ge, le).
  • any with no parameters. This tests whether a field of type Collection(Edm.String) contains any elements.
  • any and all with limited lambda expression support.
  • Geospatial functions geo.distance and geo.intersects. The geo.distance function returns the distance in kilometres between two points.

See the complete list here.
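
For example, a direct REST query combining a few of these operators might look like the following (sent with an api-key header; the service, index and field names such as location_g are placeholders, not real index fields):

GET https://[service].search.windows.net/indexes/[index]/docs?api-version=2019-05-06
    &search=boots
    &$filter=geo.distance(location_g, geography'POINT(144.9631 -37.8136)') le 10
    &$orderby=geo.distance(location_g, geography'POINT(144.9631 -37.8136)')
    &$top=20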


 

Q&A

Q) Anything on multiple region setups? Or latency considerations?

A) Multi-region setups: although I can’t comment from experience, the configuration documentation does state that you can specify multiple Azure Search instances using a pipe separator in the connection string.

<add name="cloud.search" connectionString="serviceUrl=https://searchservice1.search.windows.net;apiVersion=2015-02-28;apiKey=AdminKey1|serviceUrl=https://searchservice2.search.windows.net;apiVersion=2015-02-28;apiKey=AdminKey2" /> 

Unfortunately, the documentation does not go into much detail. It simply states that “Sitecore supports a Search service with geo-replicated scenarios”, which one would hope means that under the hood it has all the smarts to take care of this.

I’m curious about this as well and opened a stack overflow ticket. Let’s see if anyone else in the community can answer this for us.

Search Latency: 

Search latency can be directly improved by adding more replicas via the scaling settings in the Azure Portal.


Two replicas should be your starting point for an Azure Search instance to support Sitecore. Once you launch your site, follow the instructions in tip 4 above and monitor search latency. If the latency graph shows consistent spikes and high latency times above 0.5 seconds, it’s probably time to add some more replicas.