Fri, Feb 21
I have a small test cube on dev_analytics pulling down new donation IDs, contact IDs, and utm_medium on a 10-minute interval via a python3 script in my home directory. I will change the schedule to a 1-hour interval to run overnight (through Saturday, possibly the whole weekend) for additional timing tests, and pick this up again Monday for a fuller-scale trial with larger data sets.
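For reference, the polling loop can be sketched roughly as below. This is a minimal, hypothetical version of the script: the fetch function, interval value, and `max_iterations` cap (added here so the loop can terminate for testing) are all stand-ins, not the actual implementation.

```python
import time

def poll(fetch_new_rows, interval_seconds, max_iterations=None):
    """Repeatedly call fetch_new_rows, sleeping interval_seconds between
    calls. max_iterations bounds the loop for testing; None runs forever."""
    results = []
    i = 0
    while max_iterations is None or i < max_iterations:
        results.extend(fetch_new_rows())
        i += 1
        if max_iterations is None or i < max_iterations:
            time.sleep(interval_seconds)
    return results

# Stand-in fetcher returning one fixed (donation_id, contact_id, utm_medium) row:
rows = poll(lambda: [("donation_id", "contact_id", "utm_medium")],
            interval_seconds=0.01, max_iterations=3)
print(len(rows))  # 3
```

Switching from the 10-minute to the 1-hour schedule is then just a change to `interval_seconds` (or the equivalent cron entry).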
Mon, Feb 10
Thu, Feb 6
I'm curious, @Ottomata and @JAllemandou, whether there is an elegant solution to dynamically filling partitions. I.e., once a table is created with its partition columns declared and a base location established, is the best way to fill the table to issue individual ALTER TABLE statements per day, hour, etc.? Or is there a better way to accomplish this?
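To make the question concrete, here is a minimal sketch of the per-day approach I mean: a small Python helper that generates one ALTER TABLE ... ADD PARTITION statement per day. The table name and base location are hypothetical placeholders, and the statements would still need to be executed against Hive.

```python
from datetime import date, timedelta

def add_partition_statements(table, base_location, start, end):
    """Generate one ALTER TABLE ... ADD PARTITION statement per day in
    the half-open range [start, end), assuming year/month/day partition
    columns. Table and location names here are illustrative only."""
    statements = []
    d = start
    while d < end:
        statements.append(
            f"ALTER TABLE {table} ADD IF NOT EXISTS PARTITION "
            f"(year={d.year}, month={d.month}, day={d.day}) "
            f"LOCATION '{base_location}/year={d.year}/month={d.month}/day={d.day}'"
        )
        d += timedelta(days=1)
    return statements

for stmt in add_partition_statements(
    "mydb.donations", "/user/eyener/donations", date(2020, 2, 1), date(2020, 2, 4)
):
    print(stmt)
```

If the data directories already follow Hive's `key=value` layout under the table location, `MSCK REPAIR TABLE` can discover the partitions in one statement instead, which is part of what I'm hoping to learn about.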
Thanks for the suggestion, @Ottomata - I'll take it back to the team and see what makes sense. Since we've been using the json_string format since 2016, it might make sense to have the 2019 data in the same format as well.
Thank you, @JAllemandou! I am new to Hive, and was not aware I could do this myself. It worked well, though, and I appreciate the resources. You can close this ticket; much appreciated.
Thank you, @JAllemandou! I've learned several new Hive features and commands.
Tue, Feb 4
Jan 24 2020
Jan 9 2020
Thank you Eileen!
Jan 7 2020
Hi @Eileenmcnaughton, could you please backfill the data again? That would help me do an en6C wrap-up while a fix is being worked on. Thanks!
Dec 17 2019
This is great; thank you!
@Jgreen that's a good question. What is the usual time frame for a full restore-and-replace process (worst case scenario)? If we successfully switch reporting over to an analytics server and rely exclusively on it for all reporting during peak times like Big English, that could be a hindrance to a number of workflows.
Dec 11 2019
Great! Thank you, @Eileenmcnaughton!
Hi @Eileenmcnaughton, do you think I can start working with this data for current (FY1920) mailings, or are things still being investigated?
Nov 27 2019
Nov 21 2019
I would also like to request Kerberos credentials for stat100x and notebook100x machines. My username is eyener. Thank you!
To clarify the expected downtime: Is it 45 minutes or 3 to 4 hours?
Nov 15 2019
@Jgreen can you install SQLAlchemy as well, please?
Oct 31 2019
Hi Eileen - I completely agree. I would prefer the incoming data to be as raw as possible so that we're future-proofed against any changes in how we want to analyze it. So if the donor opens the email multiple times, seeing multiple opens is great. That second ask is just to confirm (which I think we're both saying) that "what the user does, the data reflects," without any wrangling, cleaning, or summarizing at this stage.
Oct 29 2019
Oct 28 2019
Oct 15 2019
Hi @Nuria, is there anything I need to do on my end at this point, or should I wait for the mentioned merge to happen?
Oct 14 2019
Hi @Nuria and team, is there anything needed from my end on this ticket?
Oct 10 2019
Hi @herron, thanks for the ping. Please let me know if this is what you need:
Oct 3 2019
Oct 1 2019
Thank you! I also have access to Turnilo. I have two follow-up questions:
Sep 30 2019
Hi @Nuria, I do have an LDAP account: eyener-ctr
I'm not certain which groups I belong to, however.
Hi @Nuria I have created a Wikitech account. My username is my full name, Erin Yener. Please let me know if you need anything else.
Sep 23 2019
Hi @Dwisehaupt and thanks for your help today. I can now access the frdev1001 db via ssh shell and wanted to follow up on 2 remaining questions:
After speaking with @jrobell, I will also need access to:
Sep 22 2019
Thanks for the instructions - I'm encountering some issues and can't identify which step in the process I'm missing/mis-applying. Would you be able to jump on a screen share at some point tomorrow (Monday) to assist?
Sep 20 2019
Thank you very much @Ejegg - I have logged in now. It looks like I am able to view reports and navigate around the Civi UI. How can I query the Civi database?
What credentials do I use to log into Civi (username/password)? My LDAP credentials don't seem to be correct. Please let me know who I should ask for this information.
@Dwisehaupt here is the public SSH key: