AdWords
For questions related to Google Shopping and Merchant Center. Learn to optimize your Shopping ads

Mitigating Product Feed Downtimes

Follower ✭ ☆ ☆
# 1

Hi AdWords Community,

 

I would like to change from submitting my Google product feed via a third-party feed compiler to submitting it directly to Google. However, I have been told that this will cause downtime in feed serving on Google Shopping while Google reprocesses the feed from the new source.

Are there any ways I can either a.) avoid this, or b.) mitigate this so that the downtime is as minimal as possible?

If I were to create a new feed in the Merchant Center, for example, could I process that one while still running on the old one, and then switch over?

 

Any other thoughts/tips?

Thanks,

 

Marcus

2 Expert replies

Re: Mitigating Product Feed Downtimes

[ Edited ]
Top Contributor
# 2

mitigating issues during such a transition relates
mainly to process and to the submitted data (feed) details.

if the current third-party data (feed) and the new feed
will be using the same file-format, that usually helps --
e.g. tab-delimited vs google-xml vs the api.
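for illustration only -- a rough python sketch of writing a minimal
tab-delimited feed with a handful of standard attribute columns; the
item values and the file-name (feed.txt) are placeholders, and a real
feed would normally carry many more attributes.

```python
import csv

# illustrative only -- a minimal tab-delimited feed with a few standard
# attribute columns; real feeds normally carry many more attributes.
items = [
    {"id": "SKU-001", "title": "Example Widget",
     "link": "https://www.example.com/sku-001",
     "price": "19.99 USD", "availability": "in stock", "condition": "new"},
]

with open("feed.txt", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=items[0].keys(), delimiter="\t")
    writer.writeheader()     # header row carries the attribute names
    writer.writerows(items)  # one row per item
```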

the best likely course would be:
(a) use the exact, same, id values in the new feed as the third-party data;
(b) use the exact, same, values for all attributes used in shopping-campaigns
    (a rough check for (a) and (b) is sketched below);
(c) if possible, use the same, current, feed-format and file-name;
(d) prepare the new feed well in advance of the transition (e.g. weeks);
(e) register a test-feed using a different, temporary, file-name for testing;
(f) if needed, update any and all test-feed related settings;
(g) test the contents and process-flow using the registered test-feed;
(h) use the test-feed to test uploads using scheduled-fetch, ftp, or both.

as was indicated, the timing for the test-feed should
be well in advance of the actual live-feed transition --
on the order of weeks or months if possible.
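
a rough sketch of the (a) and (b) checks above -- assuming both feeds
are tab-delimited with an id column; the file-names and the
campaign-attribute list below are placeholders:

```python
import csv

# rough sketch -- assumes both feeds are tab-delimited with an "id" column;
# the file names and the campaign-attribute list below are placeholders.
def load_feed(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["id"]: row for row in csv.DictReader(f, delimiter="\t")}

old_items = load_feed("third_party_feed.txt")
new_items = load_feed("new_direct_feed.txt")

# (a) the id values should match exactly between the two feeds
print("ids missing from the new feed:", sorted(set(old_items) - set(new_items)))
print("ids only in the new feed:", sorted(set(new_items) - set(old_items)))

# (b) attributes used in shopping-campaigns should keep the same values
campaign_attributes = ["brand", "product_type", "custom_label_0"]
for item_id in set(old_items) & set(new_items):
    for attr in campaign_attributes:
        if old_items[item_id].get(attr) != new_items[item_id].get(attr):
            print(f"{item_id}: {attr} changed from "
                  f"{old_items[item_id].get(attr)!r} to {new_items[item_id].get(attr)!r}")
```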

after a test-feed (content) and process-flow (submits) have been fully tested:
(1) pause any and all scheduled-fetches within the merchant-center;
(2) pause and permanently stop any server submits (ftp, etc.) via the third-party;
(3) on the local machine -- re-name the test-feed file to the current feed name;
(4) if needed, update any and all feed related settings;
(5) submit the new feed -- if possible with the same feed-file format and name;
(6) carefully and closely monitor feed processing status;
(7) monitor feed-status, per-item status, and diagnostics for at least 72-hours;
(8) carefully monitor all shopping-campaigns for at least 72-hours;
(9) finally, update any scheduled-fetch or ftp details, or both.
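
if step (5) is done as a direct upload rather than a scheduled-fetch,
the upload itself can be scripted; a minimal sketch using python's
standard ftplib -- the host, user-name, password, and file-names below
are placeholders; the real values come from the merchant-center
ftp/sftp settings.

```python
from ftplib import FTP

# placeholders -- the real host, user-name, and password come from the
# merchant-center ftp/sftp settings; the remote file-name should match
# the feed-file name registered within the merchant-center.
FTP_HOST = "ftp.example.com"
FTP_USER = "merchant-feed-user"
FTP_PASS = "********"

ftp = FTP(FTP_HOST)
ftp.login(FTP_USER, FTP_PASS)

with open("feed.txt", "rb") as f:
    ftp.storbinary("STOR feed.txt", f)   # upload the prepared feed file

ftp.quit()
```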

if the same feed-file name or the same feed-file format cannot be used,
simply follow all the steps -- especially pausing and stopping any and
all current-third-party-feed feed updates (scheduled-fetch, ftp, etc.) --
and then delete the current feed and register a new feed, just after
completing a-h testing (add a step 2a).

if the exact, same, id values and the exact, same, values for all
attributes used within shopping-campaigns cannot be used, then the
shopping-campaigns will need to change and google will consider all
items to be fresh -- all quality and historic information will be
lost and a re-review will be triggered for all items and the site.

normally, items expire automatically every 30-days --
and should be re-updated about 5-days before the 30-day
default expiration and immediately after any critical
change to the website or physical inventory availability.

if possible, the timing should be soon after the latest 30-day re-update
and not during any critical changes to the website or on-hand inventory.
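
as a quick sanity-check on that timing -- with a 30-day default
expiration and a 5-day margin, the feed would be refreshed roughly
every 25 days; a trivial sketch (the last-upload date is a placeholder):

```python
from datetime import date, timedelta

EXPIRY_DAYS = 30        # default item expiration
SAFETY_MARGIN_DAYS = 5  # re-update about 5 days before expiration

last_upload = date(2017, 6, 1)  # placeholder date of the latest feed upload
expires_on = last_upload + timedelta(days=EXPIRY_DAYS)
refresh_by = expires_on - timedelta(days=SAFETY_MARGIN_DAYS)

print(f"items expire on {expires_on}; refresh the feed by {refresh_by}")
```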

see also

https://support.google.com/merchants/answer/160567

https://support.google.com/merchants/answer/1188998

https://support.google.com/merchants/answer/1344057

https://support.google.com/merchants/answer/188477

https://support.google.com/adwords/topic/6275305

 

Re: Mitigating Product Feed Downtimes

Rising Star
# 3
Hello, you can avoid downtime by doing the following.

If your old data feed URL points to an XML file and you submit the same XML data, then there will be no downtime. All you have to do is edit the schedule of the original feed and update the link.

If you change the file format, there will most likely be a delay of around 3 working days.

Hope it helps
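
If you prefer to script that edit instead of using the Merchant Center interface, the sketch below shows one way it might look with the Content API for Shopping (datafeeds.get / datafeeds.update) via google-api-python-client; the merchant id, datafeed id, new URL, and key file are placeholders, and the API version and exact field names should be checked against the current reference before relying on this.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# placeholders -- merchant id, datafeed id, new fetch url, and key file;
# the api version and field names should be checked against the current
# Content API for Shopping reference.
MERCHANT_ID = "1234567"
DATAFEED_ID = "7654321"
NEW_FETCH_URL = "https://www.example.com/feeds/products.xml"

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/content"])
service = build("content", "v2.1", credentials=credentials)

# fetch the existing datafeed, point its scheduled fetch at the new url,
# and write it back otherwise unchanged
datafeed = service.datafeeds().get(
    merchantId=MERCHANT_ID, datafeedId=DATAFEED_ID).execute()
datafeed["fetchSchedule"]["fetchUrl"] = NEW_FETCH_URL
service.datafeeds().update(
    merchantId=MERCHANT_ID, datafeedId=DATAFEED_ID, body=datafeed).execute()
```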

Re: Mitigating Product Feed Downtimes

[ Edited ]
Top Contributor
# 4

what needs updating depends on how the third-party
is currently submitting the data -- for example, if ftp
is currently being used by the third-party, then simply
editing the scheduled-fetch url may have no effect
or may cause a rather serious conflict.

if scheduled-fetch is currently being used by the third-party
then, of course, all that may need to be done is to edit the
schedule's link url to use the feed file's new location.

as was indicated, both the submitted data and
the way the data is currently being submitted
affect the details and any potential downtime --
the less that changes, the less chance of issues.

 

Re: Mitigating Product Feed Downtimes

Follower ✭ ☆ ☆
# 5
Thanks for the detailed reply, Celebird.

We found another solution that appears to be working well. We tested taking a small chunk of items out of the feed submitted via our third party and added them, with the same unique ids, to a new feed we created and submitted directly ourselves. Once they were removed from the original feed, they switched over seamlessly and were quickly processed in the new feed, while continuing to be served in Google Shopping - giving us virtually zero downtime (i.e. not detectable). We've started to scale this process and will continue until the whole feed has been migrated.
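
The bookkeeping for each batch can be as simple as splitting the exported item list by id so that every id lives in exactly one feed at a time; here is a rough sketch (the file names and the batch id list are placeholders, and it assumes a tab-delimited feed with an "id" column):

```python
import csv

# rough sketch -- placeholder file names and batch ids; assumes a
# tab-delimited feed with an "id" column.
BATCH_IDS = {"SKU-001", "SKU-002", "SKU-003"}

with open("full_feed.txt", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f, delimiter="\t"))
fieldnames = rows[0].keys()

def write_feed(path, items):
    with open(path, "w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames, delimiter="\t")
        writer.writeheader()
        writer.writerows(items)

# the batch moves to the new direct feed; everything else stays with the
# third party until its own batch comes up.
write_feed("new_direct_feed.txt", [r for r in rows if r["id"] in BATCH_IDS])
write_feed("third_party_feed.txt", [r for r in rows if r["id"] not in BATCH_IDS])
```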

Thanks for your help,

Marcus

Re: Mitigating Product Feed Downtimes

[ Edited ]
Top Contributor
# 6

first, you're welcome and thank you for
the update; that sounds like good news.

the one potential issue with this technique --
submitting a new feed with duplicate id values --
is that if the third-party submits their feed
during the transition, the new feed items
will be overridden.

having two active feeds serving the same data, even a subset,
creates the possibility of a race-condition -- whichever feed
submits last wins.

yes, immediately removing the subset from the old feed removes that possibility.

testing first, using a test-feed, even with only a
subset of data, is still strongly recommended.
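
one cheap safeguard against that race is to verify, before each upload,
that no id is present in both feed files at the same time; a minimal
sketch (file-names are placeholders, assuming tab-delimited feeds with
an id column):

```python
import csv

def feed_ids(path):
    # assumes a tab-delimited feed with an "id" column; paths are placeholders
    with open(path, newline="", encoding="utf-8") as f:
        return {row["id"] for row in csv.DictReader(f, delimiter="\t")}

overlap = feed_ids("third_party_feed.txt") & feed_ids("new_direct_feed.txt")
if overlap:
    raise SystemExit(f"ids present in both feeds: {sorted(overlap)[:10]}")
print("no id overlap -- safe to submit both feeds")
```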