In T358095#9670338, @Pratham_Kandari wrote:
In T358095#9669653, @Chimaobichisom65 wrote:
Where should the notebook be submitted?
Hey, it has to be submitted to the project mentors via email.
Mar 28 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
Mar 26 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9653903, @CMyrick-WMF wrote:
Reminder from the project mentors: the deadline to submit your initial notebook is today.
We will be closing this project to new contributors today, March 22, at 9:00 PM UTC. Please see our previous message for details.
After today's deadline,
- If you have specific questions about your notebook you are welcome to ask.
- You may continue working on your notebook/final applications until the Outreachy deadline of April 2.
NOTE: Please email the link to your public PAWS notebook to all three of the mentors. Our email addresses are available on the Outreachy project page, in the contact info for "Isaac Johnson", "Caroline Myrick", and "Pablo Aragón".
Mar 25 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9656651, @DevJames1 wrote:
In T358412#9656644, @GonzaGertrude wrote:
Hello @Ederporto and everyone
How can I find my WikiUsername? In my case, I thought GonzaGertrude was my WikiUsername.
Try checking your PAWS link …/user/your_username/lab…
Can you confirm if it's the same as GonzaGertrude?
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
Hello @Ederporto and everyone
How can I find my WikiUsername? In my case, I thought GonzaGertrude was my Wikiusername.
Mar 16 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
Hello @Ederporto or any of my colleagues,
Help me with this issue.
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
Hello @Ederporto ,
I am confused about the proposal as I don't know what to do.
Mar 15 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9632678, @Aixvik wrote:
I am not able to generate a video from the data frame. What might the reason be? Since we have 7k+ columns and it's taking time, can we generate a bar chart race video of only a few columns as an example?
Also, I read in the documentation that we should have 'ffmpeg' installed on our device. As I am working in the web PAWS Jupyter notebook and not locally, do I also have to install 'ffmpeg'?
Mar 14 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9631809, @Mitumoni_kalita wrote:
In T358412#9629628, @GonzaGertrude wrote:
In T358412#9629419, @Abishekdascs wrote:
In T358412#9629390, @DevJames1 wrote:
In T358412#9629375, @Mitumoni_kalita wrote:
I have tried implementing bar-chart-race locally on my machine as well as on PAWS, as one of the applicants advised above, but I am unable to play the video. Can anyone here help me with this issue and rectify my error?
@Ederporto When you play the video, what is the error? Does it play but show gibberish?
Can you provide more info? @Mitumoni_kalita, for now avoid filtering warnings unless you are able to play the video, since warnings/errors might contain useful info.
Has the video file appeared in your folder?
Yes, it has appeared; only the video is not playing.
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9629419, @Abishekdascs wrote:
In T358412#9629390, @DevJames1 wrote:
In T358412#9629375, @Mitumoni_kalita wrote:
I have tried implementing bar-chart-race locally on my machine as well as on PAWS, as one of the applicants advised above, but I am unable to play the video. Can anyone here help me with this issue and rectify my error?
@Ederporto When you play the video, what is the error? Does it play but show gibberish?
Can you provide more info? @Mitumoni_kalita, for now avoid filtering warnings unless you are able to play the video, since warnings/errors might contain useful info.
Mar 13 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9628885, @Ederporto wrote:
@Aixvik, @Aananditaa001, @Anju_Maurya, @ElvisGicharu, @Abishekdascs, @Damiodunuga, @Abishek_Das, @Keamybams, @MahimaSinghal, @Andreas_Sune, @Udonels, @Omolade1414, @Chimezee, @BruceMahagwa, @GonzaGertrude, @Anachimuco, @DevJames1 and @Sheilakaruku
I have been absent from answering questions here and through email because I'm mostly interested in learning how you code and how you approach development challenges (they will appear in this and other projects in your journey).
But some of the questions and challenges I saw might need clarification; others have been resolved by kind and thoughtful individuals in this thread. Let's tackle the exercise here, and keep in mind that you all will receive personal feedback on your submissions next week.
About the task instructions
- You should create a Wikimedia account, if you don't already have one. You can do so at https://meta.wikimedia.org/w/index.php?title=Special:CreateAccount;
- Log in to the PAWS service with your Wikimedia account: https://paws.wmflabs.org/paws/hub;
These two seem not to be a challenge.
- Fork this notebook in your repository in PAWS (see the instructions here). Name your file as "T357409 - YourWikiUsername";
- Follow the specific directions in the notebook. If you have questions or need assistance, comment your inquiry in this subtask and make sure to ping @Ederporto;
You are not required to code in PAWS; feel free to do this locally, in your own Jupyter Notebook instance, for example.
If the Data visualization section doesn't work properly in your PAWS instance but works locally, that's fine by me.
The Completing the gaps section needs to work in PAWS.
- Once you feel you have completed your task, generate the public link of your notebook on PAWS (see the instructions here) and send it through email to @Ederporto (you can find his email in Outreachy);
If you are developing locally, you can upload it to PAWS, then generate the public link and send it to me.
- You can request feedback on your task until March 17, and we will answer to you until March 22, in order to give you ample time to work on the feedback and your final submission;
That will happen next week, as announced.
- Make sure to register the public link as a contribution on the Outreachy website. Your final contribution has to be submitted before April 2, at 4pm UTC.
You all have to make at least a draft proposal on the Outreachy platform by March 17th, as we will close new applications on the 18th. You can update it later, but your draft needs to be up by then.
About the notebook instructions:
- Your first function is supposed to return a list of the most viewed articles in ptwiki for January;
- You can check if your function is returning the articles in the correct order by looking at this page.
- Your second function is supposed to return a dataframe of the most viewed articles in ptwiki for January and February;
- You are free to use auxiliary functions and libraries as you wish;
- Your third function, in the Data visualization section, is supposed to get the dataframe you generated in the second function and make a bar chart visualization of it.
- You can pivot the table.
I hope this helps everyone!
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9628706, @Ederporto wrote:In T358412#9612803, @MahimaSinghal wrote:In T358412#9611792, @Udonels wrote:In T358412#9611573, @Andreas_Sune wrote:Hello, @Ederporto
I have a question about the visualisation step. In the notebook task we have to do our graphics with the previous result dataframe (top_view_dataframe). According to the bar_chart_race documentation, the dataframe to use should have dates on the rows and the different article categories on the columns, but top_view_dataframe is the opposite. My question is: can we use a different approach instead of top_view_dataframe, for example preparing our dataset with the built-in method bar_chart_race provides for that goal?
Thank you
Yes, same issue. I find it hard to visualize it with the current state of our data frame. The documentation specifies that every row must represent a single period, which is the exact opposite of ours.
@Ederporto would drop more insights.
I think to address the requirement of using the bar_chart_race library, we can reshape the DataFrame so that dates are on the rows and articles are on the columns. This way, it aligns with the expected format for the library.
@Andreas_Sune, @Udonels, @MahimaSinghal
You can pivot the dataframe inside your function without any problem, as @MahimaSinghal suggested.
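For anyone stuck on that reshaping, here is a minimal sketch of the pivot; the column names are assumptions about the long-format dataframe, not the notebook's actual names:

```python
import pandas as pd

# Long format: one row per (date, article) pair, roughly as the second
# notebook function might return it (column names are assumptions).
df = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-02"],
    "article": ["A", "B", "A", "B"],
    "views": [100, 80, 120, 90],
})

# Pivot so each row is a single period (date) and each column an article,
# which is the shape bar_chart_race expects.
wide = df.pivot(index="date", columns="article", values="views")
```

After pivoting, `wide` can be handed to `bcr.bar_chart_race(...)` directly.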
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9625552, @MahimaSinghal wrote:
In T358412#9625551, @GonzaGertrude wrote:
In T358412#9625476, @MahimaSinghal wrote:
In T358412#9625427, @Aixvik wrote:
#TODO: add parameters as necessary
def most_viewed_ptwiki_jan():
    # return a sorted list of the most viewed articles in the Portuguese Wikipedia from the top to the bottom
    url = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days"
    headers = {"User-Agent": user_agent}
    try:
        response = requests.get(url, headers=headers)
        response.raise_for_status()  # Raise an exception for 4xx or 5xx status codes
        data = response.json()
        articles = [article['article'] for article in data['items'][0]['articles']]
        return articles[:num_articles]
    except requests.RequestException as e:
        print("Error making request:", e)
    except ValueError as e:
        print("Error decoding JSON response:", e)
        print("Response content:", response.text)
I wrote this function for getting the most viewed articles in January, but it has been giving an error since yesterday and I am not able to understand my mistake.
The error might be due to the inclusion of "all-days" in the URL. This part is not required, as it's not a valid parameter for the endpoint you are trying to access. Can you please share the documentation where you read about this parameter?
The all-days segment is valid if you want to get views for a whole month.
Can you please share the documentation where it is mentioned? I think I misunderstood something then, because if you go to this URL: https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days%22 , it says "Given year/month/day is invalid date". So I thought that the all-days word may not be valid.
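For reference, the monthly /top response nests the ranked articles under items[0]['articles']. A small parsing helper can pull them out; the function name and the sample payload below are mine, purely illustrative:

```python
# The /top endpoint's day segment accepts either a two-digit day or the
# literal "all-days" for a whole-month aggregate (see the discussion above).
def extract_top_articles(payload, limit=10):
    """Pull ranked article titles out of a /pageviews/top JSON payload."""
    return [item["article"] for item in payload["items"][0]["articles"]][:limit]

# Shape of a (truncated, made-up) /top response body:
sample = {"items": [{"articles": [
    {"article": "Brasil", "views": 1000, "rank": 1},
    {"article": "YouTube", "views": 900, "rank": 2},
]}]}
```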
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9625551, @GonzaGertrude wrote:
In T358412#9625476, @MahimaSinghal wrote:
In T358412#9625427, @Aixvik wrote:
#TODO: add parameters as necessary
def most_viewed_ptwiki_jan():
    # return a sorted list of the most viewed articles in the Portuguese Wikipedia from the top to the bottom
    url = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days"
    headers = {"User-Agent": user_agent}
    try:
        response = requests.get(url, headers=headers)
        response.raise_for_status()  # Raise an exception for 4xx or 5xx status codes
        data = response.json()
        articles = [article['article'] for article in data['items'][0]['articles']]
        return articles[:num_articles]
    except requests.RequestException as e:
        print("Error making request:", e)
    except ValueError as e:
        print("Error decoding JSON response:", e)
        print("Response content:", response.text)
I wrote this function for getting the most viewed articles in January, but it has been giving an error since yesterday and I am not able to understand my mistake.
The error might be due to the inclusion of "all-days" in the URL. This part is not required, as it's not a valid parameter for the endpoint you are trying to access. Can you please share the documentation where you read about this parameter?
The all-days segment is valid if you want to get views for a whole month.
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9625476, @MahimaSinghal wrote:
In T358412#9625427, @Aixvik wrote:
#TODO: add parameters as necessary
def most_viewed_ptwiki_jan():
    # return a sorted list of the most viewed articles in the Portuguese Wikipedia from the top to the bottom
    url = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/pt.wikipedia.org/all-access/2024/01/all-days"
    headers = {"User-Agent": user_agent}
    try:
        response = requests.get(url, headers=headers)
        response.raise_for_status()  # Raise an exception for 4xx or 5xx status codes
        data = response.json()
        articles = [article['article'] for article in data['items'][0]['articles']]
        return articles[:num_articles]
    except requests.RequestException as e:
        print("Error making request:", e)
    except ValueError as e:
        print("Error decoding JSON response:", e)
        print("Response content:", response.text)
I wrote this function for getting the most viewed articles in January, but it has been giving an error since yesterday and I am not able to understand my mistake.
The error might be due to the inclusion of "all-days" in the URL. This part is not required, as it's not a valid parameter for the endpoint you are trying to access. Can you please share the documentation where you read about this parameter?
The all-days segment is valid if you want to get views for a whole month.
Mar 12 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9622547, @Funmi_Makinde wrote:
Mar 11 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9618873, @Bilal171 wrote:
Should I focus solely on acquiring data for Albedo and Agriculture, or should I also consider using other page titles for data extraction? Additionally, I'm unsure about the frequency of data extraction—whether it should be hourly, monthly, or daily. Clarifying these aspects in task 1 of the microtask will help me progress comfortably to task 2. Your guidance on this matter would be greatly appreciated.
Mar 10 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9618569, @Shruti799 wrote:
In T358095#9617990, @Satoshi_Sh wrote:
In T358095#9617983, @Shruti799 wrote:
@Satoshi_Sh
When I use the start and end dates from revision timestamps for fetching the pageviews count for articles, it's giving the following error. Please help.
The error complains about the datatype. The datatype of the revision_timestamp column is datetime64; you need to convert it to a string like "yyyymmdd". You might want to use something like df["revision_timestamp"].dt.strftime('%Y%m%d').
I modified the code, but then it gave this error.
Then I visited the link in the error message, and this was displayed for the two articles named Albedo and Agriculture.
The error happens when there is no data at that timestamp. Use a try/except block as suggested by others above.
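Putting the two suggestions above together, a sketch might look like this; the column name, helper name, and fetch signature are placeholders, not the notebook's actual code:

```python
import pandas as pd

# The revision timestamps come in as datetime64; the pageviews API wants
# plain "YYYYMMDD" strings, so convert before building requests.
df = pd.DataFrame(
    {"revision_timestamp": pd.to_datetime(["2022-01-01", "2022-01-03"])}
)
df["ts"] = df["revision_timestamp"].dt.strftime("%Y%m%d")

# Some articles have no pageview data in the window, which raises an
# exception; catch it so one missing article doesn't abort the whole loop.
def safe_views(fetch, article, start, end):
    try:
        return fetch(article, start, end)
    except Exception as exc:
        print(f"No pageview data for {article}: {exc}")
        return None
```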
Mar 9 2024
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9617956, @Mary_agiamah wrote:
I am having issues understanding the microtask; can someone shed more light on what the task is about?
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9617837, @Satoshi_Sh wrote:
Hey @Satoshi_Sh, would you share the full link to `python-mwviews/src/mwviews/api/pageviews.py`
This is the one.
https://github.com/mediawiki-utilities/python-mwviews/blob/main/src/mwviews/api/pageviews.py
GonzaGertrude added a comment to T358095: Outreachy Application Task: Tutorial for Wikipedia language-agnostic article quality modeling data.
In T358095#9604113, @Satoshi_Sh wrote:
In T358095#9603538, @ASHreyash7 wrote:
Hello Mentors @Pablo @Isaac @CMyrick-WMF
I'm encountering an error with the code below when changing the granularity from "daily" to "hourly":

p.article_views('en.wikipedia', ['Albedo', 'Agriculture'], granularity='hourly', start='20220101', end='20220103')

Error: 'The pageview API returned nothing useful.'
I will highly appreciate your assistance in this.
Hi Shreyash,
This is the source code from `python-mwviews/src/mwviews/api/pageviews.py`. I believe they don't have any available data within your selected timeframe; that's why you get the error message. You can try another timeframe. Hope this helps answer your question.

try:
    results = self.get_concurrent(urls)
    some_data_returned = False
    for result in results:
        if 'items' in result:
            some_data_returned = True
        else:
            continue
        for item in result['items']:
            output[parse_date(item['timestamp'])][item['project']] = item['views']
    if not some_data_returned:
        raise Exception(
            'The pageview API returned nothing useful at: {}'.format(urls)
        )
    return output
Hey @Satoshi_Sh, would you share the full link to `python-mwviews/src/mwviews/api/pageviews.py`
Mar 8 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9614248, @Abishek_Das wrote:
Hello everyone, I am also an Outreachy applicant for 2024.
Today, I discovered the message on Phabricator; however, I initially assumed that all discussions were taking place in the Zulip channel. Consequently, I didn't check the Phabricator comments section.
To keep it concise, I'd like to address how to make FFmpeg work on the PAWS Jupyter Notebook for the task "Create a tool for informative infographics from structured information from Wikimedia projects - Task A."
The reason FFmpeg isn't functioning on the PAWS Jupyter Notebook is that we need to download and add FFmpeg Static Builds from (https://johnvansickle.com/ffmpeg/) to the same folder where we have the code.
Here's a step-by-step guide (So, you don't have to go through the trouble of downloading from https://johnvansickle.com/ffmpeg/):
Note: Before you do the step mentioned in point (a), make sure all the steps, i.e., b, c, d, and e, are done first.
a) I've attached code that you can add to your Jupyter Notebook cell (Same notebook where you have your code to generate the bar chart race). Run this code to resolve the FFmpeg issue.
# Download a static FFmpeg build and add it to PATH.
%run 'util/load-ffmpeg.ipynb'
print('Done!')

b) Prior to running the code mentioned in point (a), add/upload the "util" folder to your PAWS. I've included the folder below.
util.zip (864 B, attached; you have to unzip it after downloading)
c) The purpose of the "util" folder is to automatically add the FFmpeg Static Build File (which is a folder) to your PAWS when you run the provided code mentioned in point (a).
d) Ensure that the filename in bcr.bar_chart_race() has a ".mp4" extension.
e) After completing these steps, you can run your respective code, which is the code for generating the bar chart race.
Note:
a) You might encounter a warning/error (which doesn't appear when I run my code locally and only sometimes appears on my PAWS Jupyter Notebook), as shown in the attached screenshot. However, this is not an issue, as the video file will be generated in the PAWS folder after running your code. You can then download the bar chart race video and watch it (as shown in the screenshot below).
b) If your code works correctly locally, it should generally (90%-99% of the time) work on the PAWS online Jupyter Notebook.
c) The warning or error screenshot I provided may or may not appear (it happens to me only on PAWS), so be mindful of that.
d) Ensure that the filename in bcr.bar_chart_race() has a ".mp4" extension, as an ".html" filename won't appear on PAWS. But, again, .html works locally.
e) Why the .html doesn't appear on PAWS, I have no idea, and I have not yet looked for a solution related to .html since the .mp4 file is generated on PAWS without any issue.
f) All the things I have mentioned on how to solve the issue related to FFmpeg were taken from various documentation, like the Matplotlib 3.8.3 documentation and, of course, my favorite Stack Overflow (so, thanks to the devs on Stack Overflow).
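A related note: if the static build ends up in a folder next to the notebook rather than on PATH, matplotlib can also be pointed at the binary explicitly. A sketch, where the path is a placeholder for wherever the util folder unpacks the build:

```python
import matplotlib

# Tell matplotlib's animation writer exactly where the ffmpeg binary is,
# instead of relying on it being discoverable via PATH. The path below is
# a placeholder, not the actual location the util folder uses.
matplotlib.rcParams["animation.ffmpeg_path"] = "./ffmpeg-static/ffmpeg"
```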
Mar 7 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9608666, @Anachimuco wrote:
In T358412#9607774, @DevJames1 wrote:
@Ederporto
I am having issues implementing the function most_viewed_ptwiki_jan_feb_per_day() as described in the task.
If I understood properly, we are asked to get daily data for each article within January and February in the Portuguese Wikipedia, and then append it to a DataFrame.
My limitations:
- No endpoint to fetch the most viewed articles daily or monthly in a project for a date range. The closest is this endpoint
https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/{project}/all-access/all-agents/daily/{start}/{end}
but it only returns views for a project cumulatively, and not for individual articles.
- Another endpoint I tried is this:
https://wikimedia.org/api/rest_v1/metrics/pageviews/top/{project}/all-access/year/month/day
This returns pageviews for articles on a project for a particular day or month (not a date range).
So to use this to solve the task, I would have to make almost 60 requests each time, trying to get articles for each day across two months, which is not efficient. Any help will be appreciated; my fellow interns, you can help if you find a way to go about it or if you feel I misunderstood the task description.
@Ederporto, can you please clarify this? I have a sense that I also misunderstood the task.
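The roughly-60-requests approach described above can at least be sketched by generating one /top URL per day of the range; the fetching and concatenation are left out, and the helper name is mine:

```python
from datetime import date, timedelta

# Build one /pageviews/top URL per day between start and end (inclusive).
# Each URL would then be fetched and its articles appended to a DataFrame.
def daily_top_urls(project, start, end):
    urls = []
    day = start
    while day <= end:
        urls.append(
            f"https://wikimedia.org/api/rest_v1/metrics/pageviews/top/"
            f"{project}/all-access/{day.year}/{day.month:02d}/{day.day:02d}"
        )
        day += timedelta(days=1)
    return urls
```

For January plus February 2024 this yields 60 URLs, which matches the "almost 60 requests" estimate in the comment.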
Mar 6 2024
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9605389, @DevJames1 wrote:
In T358412#9604805, @Anachimuco wrote:
Hi @Ederporto,
I'm facing some trouble gathering data from the Wikimedia API. Every time I make a request in the Jupyter notebook, I'm getting a response status code of 403, indicating forbidden access. Is there a workaround for this? It's worth noting that I can gather data normally from the Wikimedia API website and even when running the link outside of the notebook.
Could you please take a look at the error and see if there's a solution? Thank you
I am having the same issue with this endpoint: https://wikimedia.org/api/rest_v1/metrics/pageviews/top/{project}/{access}/{year}/{month}/{day}
It returns 403, but after reading the text property of the response, these were the indications:
- Our servers are currently under maintenance or experiencing a technical problem.
- Error: 403, Scripted requests from your IP have been blocked, please see https://meta.wikimedia.org/wiki/User-Agent_policy.
Assistance is needed to continue with the tasks.
OS: Windows
python module: requests
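On the 403: the User-Agent policy linked in the error message asks scripted clients to send a descriptive User-Agent with contact information, and the requests library's default agent string is what commonly gets blocked. A hedged sketch, where the agent string is a placeholder to replace with your own tool name and contact address:

```python
import requests

# Per https://meta.wikimedia.org/wiki/User-Agent_policy, scripted requests
# should identify the tool and a way to contact its operator. The default
# "python-requests/x.y" header is frequently rejected with a 403.
session = requests.Session()
session.headers.update(
    {"User-Agent": "outreachy-task-notebook/0.1 (contact: you@example.org)"}
)
# response = session.get(url)  # 'url' is whichever endpoint you are querying
```

Using a Session means every request made through it carries the header, so no individual call can forget it.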
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9603054, @Anachimuco wrote:
Hi @Ederporto, I hope you're doing well!
I was going through the project tasks and had a question I wanted to clarify:
When it mentions "the most viewed articles in the Portuguese Wikipedia," is it specifically referring to Brazil, a Portuguese-speaking country, or does it encompass all Portuguese-speaking countries? Just a bit confused about the wording there.
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
In T358412#9603799, @GonzaGertrude wrote:
Hello @Ederporto,
I can't access the Wikimedia API document. The link in the notebook shows this page.
I shall appreciate your help or that of any member here.
GonzaGertrude added a comment to T358412: Create tool for informative infographics from structured information from Wikimedia projects - Task A.
Hello @Ederporto,
I can't access the Wikimedia API document. The link in the notebook shows this page.
I shall appreciate your help or that of any member here.
Mar 5 2024
Oct 16 2023
In T347405#9244185, @Piracalamina wrote:
Hello, @GonzaGertrude
Dynamically fetch curriculum data from the API. For example, when a user is viewing Uruguay's curricula and clicks on the link which goes to ghana.html, it would be cool if the API function (query.py) "sensed" this and made a call to Wikidata.
Should I open an issue about that? -> To answer this question: the problem that I see here is that the execution of bot.py takes quite a long time. I think that the Wikidata query and the execution of bot.py should be done automatically but on a schedule (monthly, weekly, or daily). This should give us two voci files, one for Uruguay and one for Ghana, which are periodically updated.
Then, when the user clicks on the link, the visualization will switch between ghana's voci file and uruguay's voci file.
What do you think? If it seems sensible, please go ahead and create the issue :D
Oct 9 2023
Hello @Piracalamina @SPatnaik, I realized that my first implementation of Ghana's curriculum wasn't right, so I have reimplemented it. I have updated the README with the steps I took. I have also added an HTML file called ghana.html, which displays Ghana's curriculum. Then I added an href in index.html to link to ghana.html.
Hi @Piracalamina, I notice this task "Write documentation at the Readme file on how to feed the visualization with data from a new curriculum. The instructions should include building and running the Wikidata query. Remember that, for the time being, only Uruguay's and Ghana's curriculum are structured in Wikidata." has not been done and yet it is ticked. Should I do it?
Oct 7 2023
Hi @Piracalamina, I have implemented and deployed Wikicurricula for Ghana's national curriculum, with reference to the English Wikipedia. Furthermore, I have documented the process in the README.md. Here is a link to my PR https://github.com/wikicurricula-uy/wikicurricula-boilerplate/pull/35
Oct 6 2023
Hello @Piracalamina, I have translated variables and function names from Italian to English. Here is the link: https://github.com/wikicurricula-uy/wikicurricula-boilerplate/pull/20. In case of any errors, let me know so I can fix them.
Thanks
Hey @282001, please clone this repo: https://github.com/wikicurricula-uy/wikicurricula-boilerplate. Then pick a microtask above that has not yet been checked and do it. You can also check the Issues tab in the aforementioned repo; there are open issues.
All the best!
Oct 5 2023
Hello, I am Gonza Gertrude. I would like to contribute to this project. Thanks
Hi and thank you for your interest! Please check thoroughly https://www.mediawiki.org/wiki/New_Developers (and all of its communication section!). The page covers how to get started, assigning tasks, task status, how to find a codebase, how to create patches, where to ask general development questions and where to get help with setup problems, and how to ask good questions. Thanks a lot! :)
Content licensed under Creative Commons Attribution-ShareAlike (CC BY-SA) 4.0 unless otherwise noted; code licensed under GNU General Public License (GPL) 2.0 or later and other open source licenses. By using this site, you agree to the Terms of Use, Privacy Policy, and Code of Conduct. · Wikimedia Foundation · Privacy Policy · Code of Conduct · Terms of Use · Disclaimer · CC-BY-SA · GPL · Credits