
pywikibot: Site.page_restrictions() is cached but not updated/cleared after calling Site.protect()
Closed, Resolved (Public)

Description

During development of the tests in T59602, it was found that Site.page_restrictions() caches the pageinfo on the first call, but Site.protect() neither updates nor clears that cache, so page_restrictions() returns stale data when called after Site.protect().

Example:

site = self.get_site()
p1 = pywikibot.Page(site, u'User:Unicodesnowman/ProtectTest')

site.protect(protections=dict(edit='sysop', move='autoconfirmed'),
             page=p1,
             reason='Pywikibot unit test')
self.assertEqual(site.page_restrictions(page=p1),
                 dict([(u'edit', (u'sysop', u'infinity')),
                       (u'move', (u'autoconfirmed', u'infinity'))]))

site.protect(protections=dict(edit='', move=''),
             page=p1,
             reason='Pywikibot unit test')
self.assertEqual(site.page_restrictions(page=p1), dict())

(Fails on the last line)
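The failure comes from a common memoization pattern: the first call stashes the fetched data as an attribute on the page object, and every later call returns that attribute without refetching. A minimal, self-contained sketch of the pattern (toy Site/Page classes, not the real pywikibot API):

```python
class Page:
    """Toy stand-in for pywikibot.Page."""


class Site:
    """Toy stand-in for pywikibot.Site, showing the caching bug."""

    def __init__(self):
        self._server_state = {}  # what the "server" currently stores

    def loadpageinfo(self, page):
        # Simulate fetching pageinfo from the API and caching it.
        page._protection = dict(self._server_state)

    def page_restrictions(self, page):
        # First call caches on the page object; later calls reuse it.
        if not hasattr(page, "_protection"):
            self.loadpageinfo(page)
        return page._protection

    def protect(self, page, protections):
        # Changes server state but never touches page._protection.
        self._server_state = {k: v for k, v in protections.items() if v}


site = Site()
page = Page()
site.protect(page, dict(edit="sysop"))
assert site.page_restrictions(page) == {"edit": "sysop"}
site.protect(page, dict(edit=""))
# Stale cache: the old protection is still reported.
assert site.page_restrictions(page) == {"edit": "sysop"}
```

The second assertion mirrors the failing test above: the server-side protections are gone, but the cached attribute from the first call is still returned.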

Event Timeline

Unicodesnowman raised the priority of this task from to Needs Triage.
Unicodesnowman updated the task description.
Unicodesnowman changed Security from none to None.
Unicodesnowman subscribed.

Will adding "raise NoPage(page)" in protect() be enough to fix this?

> Will adding "raise NoPage(page)" in protect() be enough to fix this?

No. The cache needs to be managed better.
First step: find out which cache the bug is talking about.

Am I on the right track here? When Site.page_restrictions() is called the first time,

if not hasattr(page, "_protection"):

evaluates to True, so it calls

self.loadpageinfo(page)

Now when Site.page_restrictions() is called after Site.protect(),

if not hasattr(page, "_protection"):

evaluates to False, since the page already has the "_protection" attribute from the previous call; the page info is therefore not reloaded and the assertion fails.
So we have to change this such that when Site.protect() is called, it either clears the cached page info for that page or updates it by calling loadpageinfo().
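One way to realize that, sketched against the same toy pattern as above (the clear-on-write variant; the merged change instead updates the cached attribute in place):

```python
class Page:
    """Toy stand-in for pywikibot.Page."""


class Site:
    """Toy stand-in for pywikibot.Site, with the cache invalidated on protect()."""

    def __init__(self):
        self._server_state = {}  # what the "server" currently stores

    def loadpageinfo(self, page):
        # Simulate fetching pageinfo from the API and caching it.
        page._protection = dict(self._server_state)

    def page_restrictions(self, page):
        if not hasattr(page, "_protection"):
            self.loadpageinfo(page)
        return page._protection

    def protect(self, page, protections):
        self._server_state = {k: v for k, v in protections.items() if v}
        # Fix: drop the cached attribute so the next
        # page_restrictions() call refetches from the server.
        if hasattr(page, "_protection"):
            del page._protection


site = Site()
page = Page()
site.protect(page, dict(edit="sysop"))
assert site.page_restrictions(page) == {"edit": "sysop"}
site.protect(page, dict(edit=""))
# The cache was invalidated, so the fresh (empty) state is returned.
assert site.page_restrictions(page) == {}
```

Updating the attribute directly from the API response, as the patch below does, avoids the extra round trip that a refetch would cost.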

Yes, that is looking like you've understood the problem and are on the right path.

Change 268188 had a related patch set uploaded (by Dalba):
site.py: Update Page._protections after calling Site.protect()

https://gerrit.wikimedia.org/r/268188

Change 268188 merged by jenkins-bot:
site.py: Update Page._protections after calling Site.protect()

https://gerrit.wikimedia.org/r/268188