Question

$$anonymous$$ asked · 0 Likes

Finding API findItemsByKeywords returning duplicate itemIds

I am using the Finding API's findItemsByKeywords call to retrieve items. According to the API specification, itemId should always be unique:

> searchResult.item.itemId
> The ID that uniquely identifies the item listing. eBay generates this ID when an item is listed. ID values are unique across all eBay sites.
> Max length: 19 (normally, item IDs are 9 to 12 digits in length).

Out of 1404 items returned, a significant number are duplicates. Three of the duplicated itemIds are below:

| ItemId       | Count |
|--------------|-------|
| 321651618529 | 3     |
| 371220644532 | 2     |
| 390997080379 | 2     |

Can someone confirm from experience whether my understanding is correct? And why am I getting duplicates, whereas the documentation clearly says the itemId is unique eBay-wide?
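A quick way to confirm this on the client side is to count itemId occurrences over the merged pages. A minimal sketch, assuming the pages have already been fetched into a list of item dicts in the Finding API's JSON shape (where each field value is a single-element list):

```python
from collections import Counter

def find_duplicate_item_ids(items):
    """Count itemId occurrences across all fetched pages and return
    only the IDs that appear more than once."""
    # The Finding API's JSON format wraps each value in a one-element list.
    counts = Counter(item["itemId"][0] for item in items)
    return {item_id: n for item_id, n in counts.items() if n > 1}

# With the data above this would yield something like:
# {'321651618529': 3, '371220644532': 2, '390997080379': 2}
```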
Tags: finding-api, finditemsbykeywords, duplicates

Is there any solution to this? We really need the API not to return duplicate item IDs.
clickimusprime answered · 1 Like
You can get this from variation listings like 301443308224; using HideDuplicateItems hides these. It can also come from item edits or relisted items in some cases.

For your case, it may be process related. You are reading at least 15 pages, and the result set is not static. New listings get inserted into the sort order, and items move up or down the pagination list while you are fetching them.

So, for example, you read 100 items from page 1. Then a few new listings arrive, and the sort order in eBay's database places them on page 1. The insertion bumps the same number of items down to page 2. When you read page 2, its first few items are the ones that occupied the last few entries of page 1 before the insert. You perceive these as duplicates. Deletions and sold items work in reverse and cause you to miss items by shifting the remaining items up in the sort order.

Most affiliates don't care about getting all the items at once, so they just take a snapshot of items that meet their criteria and have no expectations about duplicates or completeness. But whenever you try to read all matches for more complete processing, you will get the same item returned twice (or more) in busy categories.

We used to deal with this by using StartTimeFrom/ModTimeFrom filters or StartTimeNewest sorts to keep the return order consistent between calls. Previously, the order of these sorts/filters matched the order in which items were added to the feed. But these are no longer reliable because of the delays introduced by the new system design: new items can be added to the feed 15 minutes after they are created, yet they use the actual creation time as startTime rather than the time they hit the API. So even these results can be modified during processing. StartTimeFrom/ModTimeFrom are now only valid for items listed 20 minutes ago or more. More here: https://forums.developer.ebay.com/questions/11402/starttimenewest-sort-hides-items-in-finding-reques.html

I submitted this to eBay and it was supposed to be escalated a month ago, but I haven't heard anything since. Basically, a sort by publish time (when the item was added to the API) would restore the previous functionality provided by StartTimeNewest, and a filter on that timestamp would address StartTimeFrom/ModTimeFrom and allow a static response to be generated.
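If the duplicates really come from this pagination drift, the simplest client-side guard is to drop itemIds you have already seen while paging (missed items are a separate problem this cannot fix). A minimal sketch against the Finding API's JSON interface, assuming the requests library and a placeholder app ID:

```python
import requests

ENDPOINT = "https://svcs.ebay.com/services/search/FindingService/v1"
MY_APP_ID = "..."  # your eBay application ID (placeholder)

def fetch_all_items(keywords, entries_per_page=100):
    """Page through findItemsByKeywords, dropping itemIds already seen.

    The result set can shift between page reads (see above), so a
    seen-set is the simplest client-side guard against duplicates.
    """
    seen, items, page = set(), [], 1
    while True:
        resp = requests.get(ENDPOINT, params={
            "OPERATION-NAME": "findItemsByKeywords",
            "SERVICE-VERSION": "1.13.0",
            "SECURITY-APPNAME": MY_APP_ID,
            "RESPONSE-DATA-FORMAT": "JSON",
            "keywords": keywords,
            "paginationInput.entriesPerPage": entries_per_page,
            "paginationInput.pageNumber": page,
        })
        resp.raise_for_status()
        body = resp.json()["findItemsByKeywordsResponse"][0]
        for item in body["searchResult"][0].get("item", []):
            item_id = item["itemId"][0]
            if item_id not in seen:  # skip pagination-drift duplicates
                seen.add(item_id)
                items.append(item)
        if page >= int(body["paginationOutput"][0]["totalPages"][0]):
            return items
        page += 1
```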

helios825 answered · 1 Like
In your API call, are you using the [HideDuplicateItems][1] filter?

[1]: http://developer.ebay.com/DevZone/finding/CallRef/types/ItemFilterType.html#HideDuplicateItems
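For reference, the filter goes into the request as an indexed itemFilter name/value pair. A sketch of the extra query parameters, assuming a params dict like the one built in the paging example above:

```python
# Added to the findItemsByKeywords query parameters. HideDuplicateItems
# collapses listings eBay judges to be duplicates of each other
# (same seller, same title, similar format -- see the follow-up below).
params.update({
    "itemFilter(0).name": "HideDuplicateItems",
    "itemFilter(0).value": "true",
})
```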

clickimusprime answered · 0 Likes
> **It does not restrict the duplicates based on itemid.**

It would have the same effect. In any of the cases discussed, the duplicates you are receiving should match each other on all of the listed fields, unless you're getting items that are totally different with the same IDs. Since you didn't provide any details, we can't see that, but that is very unlikely and would more likely indicate a processing error on your end.
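One way to test that claim against your own data is to group the raw (un-deduplicated) results by itemId and diff the fields of any group with more than one entry. A hedged sketch; the field list here is illustrative:

```python
from collections import defaultdict

def diff_duplicates(items, fields=("title", "sellerInfo", "listingInfo")):
    """Group items by itemId and report any duplicate entries whose
    listed fields differ -- which would suggest a processing error
    rather than ordinary pagination drift."""
    by_id = defaultdict(list)
    for item in items:
        by_id[item["itemId"][0]].append(item)
    mismatches = {}
    for item_id, group in by_id.items():
        if len(group) > 1:
            first = {f: group[0].get(f) for f in fields}
            for other in group[1:]:
                if {f: other.get(f) for f in fields} != first:
                    mismatches[item_id] = group
                    break
    return mismatches
```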

$$anonymous$$ answered · 0 Likes
Thanks for the additional details, but looking into "HideDuplicateItems" more closely, it's not the best solution for my requirements. The HideDuplicateItems documentation says item listings are considered duplicates when all of the following conditions are met:

1. Items are listed by the same seller
2. Items have exactly the same item title
3. Items have similar listing formats:
   - Auctions (Auction Items and Auction BIN items)
   - Fixed Price (Fixed Price, Multi-quantity Fixed Price, and Fixed Price with Best Offer format items)
   - Classified Ads

It does not restrict the duplicates based on itemId, which is what I need (see the sketch after this list).
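For contrast with deduplicating on itemId, here is what eBay's definition amounts to if expressed as a client-side predicate. A hedged sketch: field names follow the Finding API JSON shape used above (note that sellerInfo is only returned when the SellerInfo output selector is requested), and the format buckets are reduced to simple labels:

```python
# Format buckets from the HideDuplicateItems documentation quoted above.
FORMAT_BUCKETS = {
    "Auction": "auction", "AuctionWithBIN": "auction",
    "FixedPrice": "fixed", "StoreInventory": "fixed",
    "Classified": "classified",
}

def ebay_considers_duplicates(a, b):
    """True when two listings meet all three HideDuplicateItems
    conditions: same seller, identical title, similar listing format.
    Note that itemId plays no part in this definition."""
    same_seller = (a["sellerInfo"][0]["sellerUserName"][0]
                   == b["sellerInfo"][0]["sellerUserName"][0])
    same_title = a["title"][0] == b["title"][0]
    fmt_a = FORMAT_BUCKETS.get(a["listingInfo"][0]["listingType"][0])
    fmt_b = FORMAT_BUCKETS.get(b["listingInfo"][0]["listingType"][0])
    return same_seller and same_title and fmt_a == fmt_b
```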

$$anonymous$$ answered · 0 Likes
I am not adding this filter to the request; I will include it and let you know the results.
