
GonzaGertrude (Gonza Gertrude Agatha)
User

Projects

User does not belong to any projects.

User Details

User Since
Oct 5 2023, 2:40 AM (38 w, 21 h)
Availability
Available
LDAP User
Unknown
MediaWiki User
GonzaGertrude [ Global Accounts ]

Recent Activity

Mar 28 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

Where should the notebook be submitted?

Hey, it has to be submitted to the project mentors via email.

Mar 28 2024, 5:01 PM · Outreachy (Round 28)

Mar 26 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

Reminder from the project mentors: deadline to submit initial notebook today

We will be closing this project to new contributors today, March 22, at 9:00 PM UTC. Please see our previous message for details.

After today's deadline,

  • If you have specific questions about your notebook, you are welcome to ask.
  • You may continue working on your notebook/final applications until the Outreachy deadline of April 2.
NOTE: Please email the link to your public PAWS notebook to all three of the mentors. Our email addresses are available on the Outreachy project page, in the contact info for "Isaac Johnson", "Caroline Myrick", and "Pablo Aragón".
Mar 26 2024, 5:03 AM · Outreachy (Round 28)

Mar 25 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello @Ederporto and everyone
How can I find my WikiUsername? In my case, I thought GonzaGertrude was my WikiUsername.

Try checking your PAWS link …/user/your_username/lab…
Can you confirm whether it's the same as GonzaGertrude?

Mar 25 2024, 6:22 AM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello @Ederporto and everyone
How can I find my WikiUsername? In my case, I thought GonzaGertrude was my WikiUsername.

Mar 25 2024, 5:11 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 16 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello @Ederporto or any of my colleagues,
Help me with this issue.

Mar 16 2024, 9:58 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello @Ederporto,
I am confused about the proposal as I don't know what to do.

Mar 16 2024, 8:04 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
Mar 16 2024, 6:10 PM · Outreachy (Round 28)

Mar 15 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

I am not able to generate a video from the data frame. What might the reason be? Since we have 7k+ columns and it's taking a long time, can we generate a bar chart race video of only a few columns as an example?
I also read in the documentation that we should have 'ffmpeg' installed on our device. Since I am working in the web PAWS Jupyter notebook and not locally, do I also have to install 'ffmpeg'?
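A minimal sketch (not from the thread) of trimming a very wide frame down to a few example columns before animating it; views_df and the column count are made up for illustration, and the frame is assumed to already have dates as the index and one column per article.

    import pandas as pd

    # Hypothetical wide frame: one row per date, one column per article (imagine 7k+ columns).
    views_df = pd.DataFrame(
        {f"Article_{i}": [i, i + 1, i + 2] for i in range(50)},
        index=pd.date_range("2024-01-01", periods=3),
    )

    # Keep only the 20 articles with the highest total views; pass this subset to
    # bcr.bar_chart_race() instead of the full frame to cut rendering time.
    top_columns = views_df.sum().nlargest(20).index
    subset = views_df[top_columns]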

Mar 15 2024, 12:46 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 14 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

[Attachment: Screenshot 2024-03-14 143830.png]

I have tried implementing bar-chart-race locally on my machine as well as on PAWS, as one of the applicants advised above, but I am unable to play the video. Can anyone here help me with this issue and rectify my error?
@Ederporto

When you play the video, what is the error? Does it play but show gibberish?
Can you provide more info?

@Mitumoni_kalita, for now avoid filtering warnings unless you are able to play the video, since warnings/errors might contain useful info.

Has the video file appeared in your folder?

Yes, it has appeared; only the video is not playing.

Mar 14 2024, 5:30 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

[Attachment: Screenshot 2024-03-14 143830.png]

I have tried implementing bar-chart-race locally on my machine as well as on PAWS, as one of the applicants advised above, but I am unable to play the video. Can anyone here help me with this issue and rectify my error?
@Ederporto

When you play the video, what is the error? Does it play but show gibberish?
Can you provide more info?

@Mitumoni_kalita, for now avoid filtering warnings unless you are able to play the video, since warnings/errors might contain useful info.

Mar 14 2024, 10:56 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 13 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

@Aixvik, @Aananditaa001, @Anju_Maurya, @ElvisGicharu, @Abishekdascs, @Damiodunuga, @Abishek_Das, @Keamybams, @MahimaSinghal, @Andreas_Sune, @Udonels, @Omolade1414, @Chimezee, @BruceMahagwa, @GonzaGertrude, @Anachimuco, @DevJames1 and @Sheilakaruku

I have been absent from answering questions here and through email because I'm mostly interested in learning how you code and how you approach development challenges (they will appear in this and other projects in your journey).

But some of the questions and challenges I saw might need clarification; others have been resolved by kind and thoughtful individuals in this thread. Let's tackle the exercise here, and keep in mind that you all will receive personal feedback on your submissions next week.

About the task instructions

  1. You should create a Wikimedia account, if you don't already have one. You can do so at https://meta.wikimedia.org/w/index.php?title=Special:CreateAccount;
  2. Log in to the PAWS service with your Wikimedia account: https://paws.wmflabs.org/paws/hub;

These two seem not to be a challenge.

  3. Fork this notebook in your repository in PAWS (see the instructions here). Name your file as "T357409 - YourWikiUsername";
  4. Follow the specific directions in the notebook. If you have questions or need assistance, comment your inquiry in this subtask and make sure to ping @Ederporto;

You are not required to code in PAWS; feel free to do this locally, in your Jupyter Notebook instance, for example.
If the Data visualization doesn't work properly in your PAWS instance but works locally, that's fine with me.
The Completing the gaps section needs to work in PAWS.

  5. Once you feel you have completed your task, generate the public link of your notebook on PAWS (see the instructions here) and send it through email to @Ederporto (you can find his email in Outreachy);

If you are developing locally, you can upload it to PAWS, then generate the public link and send it to me.

  6. You can request feedback on your task until March 17, and we will answer you by March 22, in order to give you ample time to work on the feedback and your final submission;

That will happen next week, as announced.

  7. Make sure to register the public link as a contribution on the Outreachy website. Your final contribution has to be submitted before April 2, at 4pm UTC.

You all have to make at least a draft proposal on the Outreachy platform by March 17th, as we will close new applications on the 18th. You can update it later, but your draft needs to be up by then.

About the notebook instructions:

  • Your first function is supposed to return a list of the most viewed articles in ptwiki for January;
    • You can check if your function is returning the articles in the correct order by looking at this page.
  • Your second function is supposed to return a dataframe of the most viewed articles in ptwiki for January and February;
  • You are free to use auxiliary functions and libraries as you wish;
  • Your third function, in the Data visualization section, is supposed to get the dataframe you generated in the second function and make a bar chart visualization of it.
    • You can pivot the table.

I hope this helps everyone!

Mar 13 2024, 11:54 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
Mar 13 2024, 11:53 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello, @Ederporto

I have a question about the visualisation step. In the notebook task we have to do our graphics with the previous result dataframe (top_view_dataframe). According to the bar_chart_race documentation, the dataframe to use should have dates on the rows and the different article categories on the columns, but top_view_dataframe is the opposite. My question is: can we use a different approach instead of top_view_dataframe, for example prepare our dataset with the built-in method that bar_chart_race provides for that goal?

Thank you

Yes, same issue. I find it hard to visualize it with the current state of our data frame. In the documentation, it's specified that every row must represent a single period, which is the exact opposite in ours.

@Ederporto would drop more insights.

I think to address the requirement of using the bar_chart_race library, we can reshape the DataFrame so that dates are on the rows and articles are on the columns. This way, it aligns with the expected format for the library.

@Andreas_Sune, @Udonels, @MahimaSinghal
You can pivot the dataframe inside your function, without any prejudice, as @MahimaSinghal suggested.
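A minimal sketch (not from the thread) of the pivot described above, using a made-up long-format frame named top_view_dataframe with one row per (article, date) pair:

    import pandas as pd

    # Hypothetical long-format result of the second function.
    top_view_dataframe = pd.DataFrame({
        "article": ["A", "A", "B", "B"],
        "date": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-01-01", "2024-02-01"]),
        "views": [120, 150, 90, 200],
    })

    # Pivot so each row is a period and each column is an article,
    # which is the shape bar_chart_race expects.
    wide = top_view_dataframe.pivot(index="date", columns="article", values="views")
    print(wide)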

Mar 13 2024, 9:35 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

#TODO: add parameters as necessary
def most_viewed_ptwiki_jan():
    # return a sorted list of the most viewed articles in the Portuguese Wikipedia from the top to the bottom
    url = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days"
    headers = { "User-Agent": user_agent }
    try:
        response = requests.get(url, headers=headers)
        response.raise_for_status()  # Raise an exception for 4xx or 5xx status codes
        data = response.json()
        articles = [article['article'] for article in data['items'][0]['articles']]
        return articles[:num_articles]
    except requests.RequestException as e:
        print("Error making request:", e)
    except ValueError as e:
        print("Error decoding JSON response:", e)
        print("Response content:", response.text)

I wrote this function for getting the most views in January, but it has been giving an error since yesterday and I am not able to understand my mistake.

The error might be due to the inclusion of the word "all-days" in the URL. This part is not required, as it's not a valid parameter for the endpoint you are trying to access. Can you please share the documentation where you read that this should be given as a parameter?

The all-days word is valid if you want to get views for a whole month.

Can you please share the documentation where it is mentioned? I think I misunderstood something then, because if you go to this URL: https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days%22 , it says "Given year/month/day is invalid date". So I thought that the all-days word may not be valid.
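A small sketch (not from the thread) contrasting the two forms of the "top" endpoint that the exchange above discusses: a specific two-digit day versus the literal all-days segment for a whole-month ranking; the User-Agent string is a placeholder.

    import requests

    headers = {"User-Agent": "outreachy-task-demo/0.1 (example@example.org)"}  # placeholder contact
    base = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01"

    # "01" ranks articles for January 1st only; "all-days" ranks them for the whole month.
    for day in ("01", "all-days"):
        resp = requests.get(f"{base}/{day}", headers=headers)
        print(day, resp.status_code)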

Mar 13 2024, 4:53 AM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

#TODO: add parameters as necessary
def most_viewed_ptwiki_jan():
    # return a sorted list of the most viewed articles in the Portuguese Wikipedia from the top to the bottom
    url = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days"
    headers = { "User-Agent": user_agent }
    try:
        response = requests.get(url, headers=headers)
        response.raise_for_status()  # Raise an exception for 4xx or 5xx status codes
        data = response.json()
        articles = [article['article'] for article in data['items'][0]['articles']]
        return articles[:num_articles]
    except requests.RequestException as e:
        print("Error making request:", e)
    except ValueError as e:
        print("Error decoding JSON response:", e)
        print("Response content:", response.text)

I wrote this function for getting the most views in January, but it has been giving an error since yesterday and I am not able to understand my mistake.

The error might be due to the inclusion of the word "all-days" in the URL. This part is not required, as it's not a valid parameter for the endpoint you are trying to access. Can you please share the documentation where you read that this should be given as a parameter?

The all-days word is valid if you want to get views for a whole month.

Mar 13 2024, 4:51 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 12 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
Mar 12 2024, 8:19 AM · Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

[Attachment: question.PNG]
Hello everyone. Please, can anyone explain the idea or logic behind an article having revisions on the same day within short time intervals?
Could it be that it was reviewed, then updated and reviewed again?

Mar 12 2024, 6:58 AM · Outreachy (Round 28)

Mar 11 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

Should I focus solely on acquiring data for Albedo and Agriculture, or should I also consider using other page titles for data extraction? Additionally, I'm unsure about the frequency of data extraction—whether it should be hourly, monthly, or daily. Clarifying these aspects in task 1 of the microtask will help me progress comfortably to task 2. Your guidance on this matter would be greatly appreciated.

Mar 11 2024, 7:43 AM · Outreachy (Round 28)

Mar 10 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

@Satoshi_Sh
When I use the start and end dates from the revision timestamps to fetch the pageview counts for articles, it gives the following error. Please help.

[Attachment: Screenshot 2024-03-10 014040.png]

The error complains about the datatype. The datatype of the revision_timestamp column is datetime64. You need to convert it to a string like "yyyymmdd". You might want to use something like df["revision_timestamp"].dt.strftime('%Y%m%d')

[Attachment: Screenshot from 2024-03-09 14-35-51.png]

I modified the code, but then it gave this error:

[Attachment: Screenshot 2024-03-11 002416.png]

Then I visited the link in the error message, and this was displayed for the two articles named Albedo and Agriculture:
[Attachment: Screenshot 2024-03-11 002434.png]

The error happens when there is no data at that timestamp. Use a try/except block as suggested by others above.
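Putting the two suggestions above together, here is a minimal sketch (not from the thread) of formatting the datetime64 timestamps with dt.strftime('%Y%m%d') and wrapping each pageview request in try/except; the DataFrame, the article titles, and the User-Agent string are made up for illustration.

    import pandas as pd
    from mwviews.api import PageviewsClient

    # Placeholder client and contact info; use your own descriptive User-Agent.
    p = PageviewsClient(user_agent="outreachy-task-demo/0.1 (example@example.org)")

    # Hypothetical revisions frame with a datetime64 column, as in the screenshots above.
    df = pd.DataFrame({
        "page_title": ["Albedo", "Agriculture"],
        "revision_timestamp": pd.to_datetime(["2022-01-01", "2022-01-02"]),
    })

    # article_views expects string dates like "YYYYMMDD", not datetime64 values.
    df["revision_day"] = df["revision_timestamp"].dt.strftime("%Y%m%d")

    views = {}
    for title, day in zip(df["page_title"], df["revision_day"]):
        try:
            views[title] = p.article_views("en.wikipedia", [title], start=day, end=day)
        except Exception as exc:
            # Some days simply have no pageview data; skip them instead of crashing.
            print(f"No pageview data for {title} on {day}: {exc}")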

Mar 10 2024, 7:34 PM · Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
Mar 10 2024, 7:33 PM · Outreachy (Round 28)

Mar 9 2024

GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

I am having issues understanding the microtask. Can someone shed more light on what the task is about?

Mar 9 2024, 11:29 PM · Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

Hey @Satoshi_Sh, would you share the full link to `python-mwviews/src/mwviews/api/pageviews.py`?

This is the one.
https://github.com/mediawiki-utilities/python-mwviews/blob/main/src/mwviews/api/pageviews.py

Mar 9 2024, 4:25 PM · Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.

Hello Mentors @Pablo @Isaac @CMyrick-WMF
I'm encountering an error with the code below when changing the granularity from "daily" to "hourly".

p.article_views('en.wikipedia', ['Albedo', 'Agriculture'], granularity='hourly', start='20220101', end='20220103')

Error: 'The pageview API returned nothing useful.'

I will highly appreciate your assistance with this.

Hi Shreyash,

This is the source code from `python-mwviews/src/mwviews/api/pageviews.py`. I believe they don't have any available data within your selected timeframe. That's why you get the error message. You can try another timeframe. Hope this answers your question.

try:
    results = self.get_concurrent(urls)
    some_data_returned = False
    for result in results:
        if 'items' in result:
            some_data_returned = True
        else:
            continue
        for item in result['items']:
            output[parse_date(item['timestamp'])][item['project']] = item['views']

    if not some_data_returned:
        raise Exception(
            'The pageview API returned nothing useful at: {}'.format(urls)
        )
    return output

Hey @Satoshi_Sh, would you share the full link to `python-mwviews/src/mwviews/api/pageviews.py`?
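A minimal sketch (not from the thread) of working around that error by falling back from hourly to daily granularity; the client setup mirrors the call quoted above, and the User-Agent string is a placeholder.

    from mwviews.api import PageviewsClient

    p = PageviewsClient(user_agent="outreachy-task-demo/0.1 (example@example.org)")  # placeholder contact

    try:
        # Hourly granularity is generally not available for per-article pageviews, so this can raise.
        data = p.article_views('en.wikipedia', ['Albedo', 'Agriculture'],
                               granularity='hourly', start='20220101', end='20220103')
    except Exception:
        # Fall back to daily granularity, which the per-article endpoint supports.
        data = p.article_views('en.wikipedia', ['Albedo', 'Agriculture'],
                               granularity='daily', start='20220101', end='20220103')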

Mar 9 2024, 3:54 PM · Outreachy (Round 28)
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
Mar 9 2024, 3:51 PM · Outreachy (Round 28)

Mar 8 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello everyone, I am also an Outreachy applicant for 2024.

Today, I discovered the message on Phabricator; however, I initially assumed that all discussions were taking place in the Zulip channel. Consequently, I didn't check the Phabricator comments section.

To keep it concise, I'd like to address how to make FFmpeg work on the PAWS Jupyter Notebook for the task "Create a tool for informative infographics from structured information from Wikimedia projects - Task A."

The reason FFmpeg isn't functioning on the PAWS Jupyter Notebook is that we need to download FFmpeg Static Builds from https://johnvansickle.com/ffmpeg/ and add them to the same folder where we have the code.

Here's a step-by-step guide (So, you don't have to go through the trouble of downloading from https://johnvansickle.com/ffmpeg/):

Note: Before you do the step mentioned in point (a), make sure all the steps, i.e., b, c, d, and e, are done first.

a) I've attached code that you can add to your Jupyter Notebook cell (Same notebook where you have your code to generate the bar chart race). Run this code to resolve the FFmpeg issue.

# Download a static FFmpeg build and add it to PATH.
%run 'util/load-ffmpeg.ipynb'
print('Done!')

b) Prior to running the code mentioned in point (a), add/upload the "util" folder to your PAWS. I've included the folder below.


(You have to unzip it after downloading.)

c) The purpose of the "util" folder is to automatically add the FFmpeg Static Build File (which is a folder) to your PAWS when you run the provided code mentioned in point (a).

d) Ensure that the filename in bcr.bar_chart_race() has a ".mp4" extension.

e) After completing these steps, you can run your respective code, which is the code for generating the bar chart race.

Note:

a) You might encounter a warning or error (which doesn't appear when I run my code locally and only sometimes appears on my PAWS Jupyter Notebook), as shown in the attached screenshot. However, this is not an issue, as the video file will still be generated in the PAWS folder after running your code. You can then download the bar chart race video and watch it (as shown in the screenshot below).

[Attachment: Code for ffmpeg.png]

[Attachment: mp 4.png]

b) If your code works correctly locally, it should generally (90%-99% of the time) work on the PAWS online Jupyter Notebook.

c) The warning or error screenshot I provided may or may not appear (which happens to me only on PAWS), so be mindful of that.

d) Ensure that the filename in bcr.bar_chart_race() has a ".mp4" extension, as the ".html" output won't appear on PAWS. But, again, the .html works locally. (A minimal call is sketched after these notes.)

e) Why the .html doesn't appear on PAWS, I have no idea; I have not yet looked for a solution related to .html, since the .mp4 file is generated on PAWS without any issue.

f) Everything I have mentioned on how to solve the FFmpeg issue was taken from various documentation, like the Matplotlib 3.8.3 documentation and, of course, my favorite, Stack Overflow (so, thanks to the devs on Stack Overflow).
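A minimal sketch of the call referenced in note (d), assuming a small wide DataFrame with dates as the index and one column per article; the frame and filename here are made up for illustration, and ffmpeg must be available as described above.

    import pandas as pd
    import bar_chart_race as bcr

    # Hypothetical wide frame: dates as the index, one column per article.
    wide = pd.DataFrame(
        {"Brasil": [100, 140, 180], "Carnaval": [80, 160, 120]},
        index=pd.date_range("2024-01-01", periods=3),
    )

    # A ".mp4" filename makes bar_chart_race write a video file via ffmpeg,
    # which can then be downloaded from the PAWS file browser and played locally.
    bcr.bar_chart_race(df=wide, filename="ptwiki_top_articles.mp4", n_bars=2, period_length=500)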

Mar 8 2024, 9:16 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 7 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

@Ederporto
I am having issues implementing this function

most_viewed_ptwiki_jan_feb_per_day():

as in the task description.

If I understood properly, we are asked to get daily data for each article within January and February on the Portuguese Wikipedia, and then append it to a DataFrame.

My limitations:

  • No endpoint to fetch the most viewed articles daily or monthly in a project for a date range. The closest is this endpoint
https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/{project}/all-access/all-agents/daily/{start}/{end}

but it only returns views for a project cumulatively, not for individual articles.

  • Another endpoint I tried is this:
https://wikimedia.org/api/rest_v1/metrics/pageviews/top/{project}/all-access/year/month/day

This returns pageviews for articles on a project for a particular day or month (not a date range).
So to use this to solve the task, I would have to make almost 60 requests each time, trying to get articles for each day across the two months, which is not efficient enough.

So any help will be appreciated; my fellow interns, you can help if you find a way to go about it, or if you feel I misunderstood the task description. (A rough sketch of the per-day loop appears after this comment.)

@Ederporto, can you please clarify this? I have a sense that I also misunderstood the task.
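A rough sketch (not from the thread) of the brute-force approach described above: one request per day to the "top" endpoint for January and February 2024, collected into a long DataFrame. The User-Agent string is a placeholder, and this does nothing to address the efficiency concern about making roughly 60 requests.

    import pandas as pd
    import requests

    headers = {"User-Agent": "outreachy-task-demo/0.1 (example@example.org)"}  # placeholder contact
    rows = []

    # One request per day of January and February 2024 (60 requests in total).
    for day in pd.date_range("2024-01-01", "2024-02-29"):
        url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/top/"
               f"pt.wikipedia.org/all-access/{day:%Y/%m/%d}")
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        for art in resp.json()["items"][0]["articles"]:
            rows.append({"date": day, "article": art["article"], "views": art["views"]})

    per_day = pd.DataFrame(rows)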

Mar 7 2024, 2:19 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 6 2024

GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hi @Ederporto,

I'm facing some trouble gathering data from the Wikimedia API. Every time I make a request in the Jupyter notebook, I'm getting a response status code of 403, indicating forbidden access. Is there a workaround for this? It's worth noting that I can gather data normally from the Wikimedia API website and even when running the link outside of the notebook.

Could you please take a look at the error and see if there's a solution? Thank you

[Attachment: Screenshot 2024-03-05 at 22.19.26.png]

I am having the same issue with this endpoint: https://wikimedia.org/api/rest_v1/metrics/pageviews/top/{project}/{access}/{year}/{month}/{day}
it returns 403, but after inspecting the text property of the response, these were the indications:

  1. Our servers are currently under maintenance or experiencing a technical problem.
  2. Error: 403, Scripted requests from your IP have been blocked, please see https://meta.wikimedia.org/wiki/User-Agent_policy.

Assistance is needed to continue with the tasks. (A sketch of a request with a descriptive User-Agent follows below.)
OS: Windows
Python module: requests

[Attachment: Eror_Making_get_requests.jpg]
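The second message above points at the Wikimedia User-Agent policy, which asks scripted clients to identify themselves. A minimal sketch (not from the thread) of sending a descriptive User-Agent with requests; the header value is a placeholder to be replaced with your own tool name and contact.

    import requests

    # Per https://meta.wikimedia.org/wiki/User-Agent_policy: name the script and give a contact.
    headers = {"User-Agent": "outreachy-task-demo/0.1 (https://example.org; example@example.org)"}

    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/top/"
           "pt.wikipedia.org/all-access/2024/01/all-days")
    resp = requests.get(url, headers=headers)
    print(resp.status_code)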

Mar 6 2024, 3:14 PM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hi @Ederporto, I hope you're doing well!

I was going through the project tasks and had a question I wanted to clarify:

When it mentions "the most viewed articles in the Portuguese Wikipedia," is it specifically referring to Brazil, a Portuguese-speaking country, or does it encompass all Portuguese-speaking countries? Just a bit confused about the wording there.

Mar 6 2024, 2:04 AM · Outreach-Programs-Projects, Outreachy (Round 28)
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.

Hello @Ederporto,
I can't access the Wikimedia API document. The link in the notebook shows this page.

[Attachment: Api.JPG]

I would appreciate your help, or that of any member here.

Mar 6 2024, 1:33 AM · Outreach-Programs-Projects, Outreachy (Round 28)

Mar 5 2024

GonzaGertrude updated GonzaGertrude.
Mar 5 2024, 2:42 PM

Oct 16 2023

GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hello, @GonzaGertrude
Dynamically fetch curriculum data from the API. For example, when a user is viewing Uruguay's curricula and clicks on the link which goes to ghana.html, it would be cool if the API function (query.py) "sensed" this and made a call to Wikidata.
Should I open an issue about that?
-> To answer this question: the problem that I see here is that the execution of bot.py takes quite a long time.

I think that the Wikidata query and the execution of bot.py should be done automatically but on a schedule (monthly, weekly, or daily). This should give us 2 voci files, one for Uruguay and one for Ghana, which are periodically updated.

Then, when the user clicks on the link, the visualization will switch between Ghana's voci file and Uruguay's voci file.

What do you think? If it seems sensible, please go ahead and create the issue :D

Oct 16 2023, 11:42 AM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata

Oct 9 2023

GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hello @Piracalamina @SPatnaik, I realized that my first implementation of Ghana's curriculum wasn't right, so I have reimplemented it. I have updated the README with the steps I took. I have also added an HTML file called ghana.html, which displays Ghana's curriculum. Then, I added an href in index.html to link to ghana.html.

Oct 9 2023, 6:43 PM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata
GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hi @Piracalamina, I notice this task "Write documentation at the Readme file on how to feed the visualization with data from a new curriculum. The instructions should include building and running the Wikidata query. Remember that, for the time being, only Uruguay's and Ghana's curriculum are structured in Wikidata." has not been done and yet it is ticked. Should I do it?

Oct 9 2023, 1:47 PM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata

Oct 7 2023

GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hi @Piracalamina, I have implemented and deployed Wikicurricula for Ghana's national curriculum, with reference to the English Wikipedia. Furthermore, I have documented the process in the README.md. Here is a link to my PR https://github.com/wikicurricula-uy/wikicurricula-boilerplate/pull/35

Oct 7 2023, 11:09 PM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata

Oct 6 2023

GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hello @Piracalamina, I have translated variables and function names from Italian to English. Here is the link: https://github.com/wikicurricula-uy/wikicurricula-boilerplate/pull/20. In case of any errors, let me know so I can fix them.
Thanks

Oct 6 2023, 1:58 PM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata
GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hey @282001, please clone this repo: https://github.com/wikicurricula-uy/wikicurricula-boilerplate. Then pick a microtask above that has not yet been checked and do it. You can also check the issues tab in the aforementioned repo; there are open issues.
All the best!

Oct 6 2023, 12:10 PM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata

Oct 5 2023

GonzaGertrude added a comment to T347405: WikiCurricula, an interface to represent curricula data.

Hello, I am Gonza Gertrude. I would like to contribute to this project. Thanks

Hi and thank you for your interest! Please check thoroughly https://www.mediawiki.org/wiki/New_Developers (and all of its communication section!). The page covers how to get started, assigning tasks, task status, how to find a codebase, how to create patches, where to ask general development questions and where to get help with setup problems, and how to ask good questions. Thanks a lot! :)

Oct 5 2023, 2:48 AM · Outreach-Programs-Projects, Outreachy (Round 27), Wikidata