Frequently asked questions - Search

FAQ

Task Queue

Question: Why are tasks waiting longer in queues?

Answer: Updates are removed from the queue only when enough resources are available to complete the update without error. Resource availability varies throughout the day depending on when other clients have scheduled their updates and when system tasks are scheduled to run. Because the number of clients on the server has increased, any averages we calculated previously are now out of date. Based on feedback about the task queue we have identified the following items to look at:

  1. Talk to staff about scheduling updates for specific times to distribute load better and reduce task time.

  2. Review resource usage and increase resources if required.

  3. Review task queue logs for the average time spent in the queue over the day (see the sketch after this list).

  4. Set up monitoring and alerts for task times (medium/long term).
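As a starting point for item 3, here is a minimal sketch that averages queue wait times per hour from a task queue log. The log path and line format are assumptions for illustration only; adjust them to whatever the actual task queue logs contain.

import re
from collections import defaultdict
from datetime import datetime

# Assumed log format (illustrative only):
#   2024-05-01T10:15:30 task=update-collection queued_seconds=412
LINE = re.compile(r"^(?P<ts>\S+) task=\S+ queued_seconds=(?P<wait>\d+)")

def average_wait_by_hour(log_path):
    """Return {hour: average seconds spent waiting in the queue}."""
    totals = defaultdict(lambda: [0, 0])  # hour -> [sum of waits, count]
    with open(log_path) as log:
        for line in log:
            match = LINE.match(line)
            if not match:
                continue
            hour = datetime.fromisoformat(match["ts"]).hour
            totals[hour][0] += int(match["wait"])
            totals[hour][1] += 1
    return {hour: total / count for hour, (total, count) in sorted(totals.items())}

if __name__ == "__main__":
    for hour, avg in average_wait_by_hour("task-queue.log").items():
        print(f"{hour:02d}:00  average wait {avg:.0f}s")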

Question: Why is the broken links report not working as expected?

Answer: The broken links report is not quite what you might expect. Normally a broken links report would check a site for all broken links, but this one only reports on URLs the crawler could not fetch. It does not cover linked resources (images, CSS, JS, etc.) or external links, both of which you would usually expect to see in a broken links report.
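If you need to cover that gap yourself, a small client-side check can walk a page's links and resources and flag anything that does not resolve. The sketch below is illustrative only: it checks a single page, not a whole site, and assumes the requests library is available.

from html.parser import HTMLParser
from urllib.parse import urljoin

import requests

class LinkCollector(HTMLParser):
    """Collect href/src URLs from anchors, stylesheets, images and scripts."""

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("a", "link") and attrs.get("href"):
            self.urls.append(attrs["href"])
        if tag in ("img", "script") and attrs.get("src"):
            self.urls.append(attrs["src"])

def find_broken(page_url):
    """Return a list of (url, status or error) for links that did not resolve."""
    page = requests.get(page_url, timeout=10)
    collector = LinkCollector()
    collector.feed(page.text)
    broken = []
    for url in collector.urls:
        absolute = urljoin(page_url, url)
        try:
            response = requests.head(absolute, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                broken.append((absolute, response.status_code))
        except requests.RequestException as error:
            broken.append((absolute, str(error)))
    return broken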

Question: My push collection fails while adding documents.

Answer: Occasionally, pushing documents fails with an exception stating that the license limit has been exceeded. If you are certain the client is still well within the license limit, follow these steps to successfully push documents again:

  1. Visit Squiz DXP Push API Stop from the admin panel to access the API UI.

  2. Add the collection ID to the API and execute it. This will stop the push collection and release the lock.

  3. Add a test document to the same push collection using Squiz DXP Add Documents to Push API. You can use any test document and test key.

  4. Once successful, use Squiz DXP Delete documents from Push Index with the same collection and test key used for adding the document.

If all of the above steps are successful, your push index is reset and you can begin pushing actual documents (a scripted version of these steps is sketched below).
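For repeat occurrences, the same reset can be scripted against the Push API. The sketch below is a rough outline only: the endpoint paths, the key parameter name and the authentication header are assumptions for illustration, so replace them with the exact routes and parameters shown in the admin API UI.

import requests

BASE = "https://example-search.clients.funnelback.com"  # assumption: your DXP search endpoint
COLLECTION = "example~push"                              # assumption: your push collection ID
HEADERS = {"X-Security-Token": "..."}                    # assumption: however your API UI authenticates

# 1. Stop the push collection to release the lock.
#    (Path is an assumption; copy the real one from the "Push API Stop" tool.)
requests.post(f"{BASE}/v1/collections/{COLLECTION}/stop",
              headers=HEADERS, timeout=30).raise_for_status()

# 2. Add a throwaway test document under a test key.
requests.put(
    f"{BASE}/v1/collections/{COLLECTION}/documents",
    params={"key": "test-key"},
    data="<html><body>test</body></html>",
    headers={**HEADERS, "Content-Type": "text/html"},
    timeout=30,
).raise_for_status()

# 3. Delete the same test document again.
requests.delete(
    f"{BASE}/v1/collections/{COLLECTION}/documents",
    params={"key": "test-key"},
    headers=HEADERS,
    timeout=30,
).raise_for_status()

# If all three calls succeed, the push index is reset and real documents can be pushed again.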

Note: The false license limit exception often occurs because of the calculation time required when a client has a large number of collections containing many documents.

Sometimes a push fails with a 500 Internal Server Error:

Usually the cause is too many concurrent requests. Make sure that the following settings are applied to the push collection:

#AUTO-PROVISION collection-update.step.AnnieAMeta.run=false
#AUTO-PROVISION collection-update.step.AnnieAPrimaryCollection.run=false
#AUTO-PROVISION collection-update.step.BuildAutoCompletion.run=false
#AUTO-PROVISION collection-update.step.BuildAutoCompletionForMeta.run=false
#AUTO-PROVISION collection-update.step.BuildCollapsingSignatures.run=false
#AUTO-PROVISION collection-update.step.BuildMatchOnlyIndexForAutoc.run=false
#AUTO-PROVISION collection-update.step.BuildSpelling.run=false
#AUTO-PROVISION collection-update.step.BuildSpellingForMeta.run=false
#AUTO-PROVISION collection-update.step.ClickLogs.run=false
#AUTO-PROVISION collection-update.step.ContentAuditorSummary.run=false
push.commit-type=FAST
push.commit.index.parallel.max-index-thread-count=4
push.scheduler.changes-before-auto-commit=1000
push.scheduler.auto-commit-timeout-seconds=900
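On the client side it can also help to cap how many documents are pushed in parallel. A minimal sketch, assuming a hypothetical push_document(url, content) helper that performs the actual Push API call:

from concurrent.futures import ThreadPoolExecutor, as_completed

MAX_CONCURRENT_PUSHES = 4  # assumption: tune to match the server-side thread count above

def push_all(documents, push_document):
    """Push documents with a bounded number of concurrent requests.

    `documents` is an iterable of (url, content) pairs and `push_document`
    is whatever function you already use to call the Push API.
    """
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_PUSHES) as pool:
        futures = [pool.submit(push_document, url, content) for url, content in documents]
        for future in as_completed(futures):
            future.result()  # re-raise any push failure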

Once your job has run, you will need to run a vacuum with vacuum type “Merge” via the admin API UI:

Re-indexing and merging push indexes
POST /v1/collections/{collection}/vacuum
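If you prefer to trigger this outside the API UI, the call looks roughly like the following. The vacuum type parameter name and value are assumptions; confirm both against the “Re-indexing and merging push indexes” entry in the API UI before using them.

import requests

BASE = "https://example-search.clients.funnelback.com"  # assumption: your DXP search endpoint
COLLECTION = "example~push"                              # assumption: your push collection ID

# POST /v1/collections/{collection}/vacuum with a "Merge" vacuum type.
# The parameter name below is an assumption; the API UI shows the exact one.
response = requests.post(
    f"{BASE}/v1/collections/{COLLECTION}/vacuum",
    params={"vacuum-type": "MERGE"},
    headers={"X-Security-Token": "..."},  # assumption: however your API UI authenticates
    timeout=60,
)
response.raise_for_status()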

Question: Why are my analytics missing?

Answer: First, check whether the client was migrated to the DXP from on-premises servers only recently. If so, we usually do not have access to their analytics log files, which is why analytics are missing in the DXP.

If the client was migrated long enough ago and their analytics still do not show, Rundeck is the place to go:

  1. Log in to Rundeck for the appropriate region, e.g. for US: https://rundeck-us.tools.funnelback.squiz.cloud:8443/project/SaaS/jobs

  2. Select “Generate Analytics Report” under the “Migration” option.

  3. Select the correct collection name from the drop-down.

  4. Select “All” in the year drop-down.

  5. Select “Yes” in delete-old-database.

  6. Execute the job.

Then check if analytics is working.

If the client is on Squiz Cloud, submit a DXPSUP ticket for the Search team to take care of the issue.