I'm hoping the old p.g.o will be 'usable' once it stops getting traffic and we can compare. My suspicion is that the new p.g.o uses split indices (for ES6) and these splits are causing "duplicates". They are not strictly duplicates (each entry corresponds to a change), but it's possible the old index allowed for some kind of aggregation or rollup that made the display less confusing. 1) Confirm that the filters are working, e.g. "keyword" feeds should only show keyword changes? I'm not convinced this works as intended. 2) Unrelated to 1, it appears that aggregation might be needed to group multiple changes for a given CP into one 'event' for some views.
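To illustrate point 2, a minimal sketch of the kind of rollup I mean: collapsing several per-arch change entries for the same CP into a single display event. The field names (cp, version, arch) are hypothetical, not the actual p.g.o schema.

```ruby
# Hypothetical change entries as they come back from the index; in the new
# p.g.o each arch stabilization appears as its own entry.
changes = [
  { cp: 'dev-lang/ruby', version: '2.6.1', arch: 'amd64' },
  { cp: 'dev-lang/ruby', version: '2.6.1', arch: 'x86'   },
  { cp: 'app-misc/foo',  version: '1.0',   arch: 'amd64' },
]

# Roll the entries up so each CP+version pair becomes one 'event' listing
# all arches, which is roughly what a less confusing feed view would show.
events = changes.group_by { |c| [c[:cp], c[:version]] }.map do |(cp, version), cs|
  { cp: cp, version: version, arches: cs.map { |c| c[:arch] } }
end
# => [{cp: "dev-lang/ruby", version: "2.6.1", arches: ["amd64", "x86"]},
#     {cp: "app-misc/foo",  version: "1.0",   arches: ["amd64"]}]
```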
So I think this is partly because the timestamps are from indexing (when a change is ingested into the index) as opposed to the actual git timestamps, so the ordering is based entirely on index time and not git time. This makes sense based on the query:

  def stabled_packages
    Rails.cache.fetch('stabled_packages', expires_in: 10.minutes) do
      Change.find_all_by(:change_type, 'stable',
                         { size: 50, sort: { created_at: { order: 'desc' } } }).map do |change|
        change.to_os(:change_type, :package, :category, :version, :arches, :created_at)
      end
    end
  end

So ideally we would record the git commit timestamp at import time and re-sort the changes on that. This does not necessarily address the duplicate entries, though.
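A quick sketch of why the two timestamps give different orderings, assuming we stored a hypothetical commit_ts field at import time alongside the existing created_at (index time); these field names are assumptions, not the current schema.

```ruby
require 'time'

# Two changes: portage's commit is older, but it was ingested later, so a
# created_at sort (what the query above does) shows it first.
changes = [
  { package: 'dev-lang/ruby',
    commit_ts:  Time.parse('2019-01-02T10:00:00Z'),   # git commit time
    created_at: Time.parse('2019-01-03T00:00:00Z') }, # index ingest time
  { package: 'sys-apps/portage',
    commit_ts:  Time.parse('2019-01-01T09:00:00Z'),
    created_at: Time.parse('2019-01-04T00:00:00Z') },
]

# Sorting descending by the recorded git timestamp restores git order.
by_commit = changes.sort_by { |c| c[:commit_ts] }.reverse
by_index  = changes.sort_by { |c| c[:created_at] }.reverse
```

With commit_ts recorded, the ES sort clause would just target that field instead of created_at; the duplicates question is separate and still needs the aggregation discussed above.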
*** Bug 607646 has been marked as a duplicate of this bug. ***
We rewrote the entire application.