Shrey Agarwal (Work Log)

09/27/2016 14:00 - 17:00
- Set up personal and work log pages, accessed Remote Desktop.
- Compiled list of accelerators from Wiki
09/29/2016 14:00 - 16:15; 16:45 - 17:30
- Created a new project, Accelerator Seed List (Data), and worked with Dr. Egan to create a schematic for data entry.
- Evaluated 3 sources from List of Accelerators and logged the data. Recorded each step on the project page and identified categories that would be suitable for web crawling in the future.
10/11/2016 14:00 - 17:30
- Explored how to use regular expressions in TextPad to aid with data sorting (still need to review the expressions with Dr. Egan); see the sketch after this entry.
- Continued evaluating sources from List of Accelerators and recorded steps on the project page, as before. Finished evaluating the six sources from the initial list. (All work done in Accelerator Seed List (Data))
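
The exact expressions from this session were not written down, so purely as an illustration, here is a minimal Python sketch of the kind of pattern involved; the "Name - Location (URL)" line format and field names are assumptions, not the actual data layout:

    import re

    # Hypothetical source line; the real files used formats that varied by source.
    line = "Techstars Boulder - Boulder, CO (http://www.techstars.com)"
    # Split "Name - Location (URL)" into tab-separated fields for easier sorting.
    m = re.match(r"^(.+?) - (.+?) \((\S+)\)$", line)
    if m:
        name, location, url = m.groups()
        print("\t".join([name, location, url]))
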
10/13/2016 14:00 - 17:00
- All work done in Accelerator Seed List (Data)
- Talked to Dr. Egan about the project going forward. Need to pick out 10-15 accelerators from the sources listed on my project page and identify a reliable method for obtaining cohort information, as well as other variables.
- Used Google searches to identify more sources, and evaluated three databases with the help of TextPad.
- Began working on more generic Google searches. Was able to get through "Location+accelerator"-type searches today. Will continue next time.
10/18/2016 14:00 - 17:30
- Work continued in Accelerator Seed List (Data)
- Took a sample of 10 accelerators and detailed how to extract cohort information, as well as what other information is readily available from accelerator URLs.
- Brought Matthew up to speed on the accelerator project, added summaries to each section to make them easier to follow, and worked with him to finish extracting cohort information.
10/20/16 14:30 - 17:30
- Work continued in Accelerator Seed List (Data)
- Finished the list of instructions for finding the cohort. Continued compiling the list of variables for each of the accelerators in the sample.
- Consulted Peter on the prospects of creating a web crawler with the information we have compiled so far. Determined it was possible, although beyond the scope of Peter's knowledge.
10/25/16 14:00 - 17:00
- Consulted Ed on the next step for the project.
- Began laying out the E-R diagram on the accelerator database page, where entities are potential categories and each entity has its associated attributes.
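
Purely as an illustration of the entity/attribute idea, a Python sketch follows; the class names and attributes are assumptions, not the actual diagram on the database page:

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical entities: each class is one E-R entity, each field one attribute.
    @dataclass
    class Accelerator:
        name: str
        url: str = ""
        location: str = ""

    @dataclass
    class Cohort:
        accelerator_name: str  # reference back to the parent Accelerator
        year: int
        companies: List[str] = field(default_factory=list)
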
10/27/16 14:00 - 17:00
- Continued working with Matthew to identify elements in the E-R diagram for pulling information on accelerators.
- Found sources to obtain/cross-reference information (e.g., AngelList)
11/08/16 14:00 - 18:00
- Identified possible keywords for filtering accelerator search results
- Began compiling a comprehensive list of accelerators based on the data we have already sifted through.
- Learned from Ben how to use regular expressions to split names onto individual lines and sort them alphabetically.
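
A minimal sketch of that split-and-sort step, assuming a hypothetical comma/semicolon-delimited input line (the real delimiters varied by source):

    import re

    # Hypothetical input: several names run together on one line.
    raw = "Y Combinator, Techstars; 500 Startups,AngelPad"
    # Split on commas/semicolons, drop empty pieces, then sort case-insensitively.
    names = [n.strip() for n in re.split(r"[,;]", raw) if n.strip()]
    for name in sorted(names, key=str.lower):
        print(name)
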
11/10/16 14:00 - 18:00
- Began sorting through the accelerator list and removing duplicates (see the dedupe sketch after this entry), as well as identifying more places to pull names from.
- Worked with Peter to create a crawler for f6s, because the site's results are not limited to accelerators.
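
For the duplicate removal mentioned above, a minimal sketch under the assumption that duplicates differed only in case and spacing (real entries may have needed fuzzier matching):

    # Hypothetical input list standing in for the compiled accelerator names.
    accelerators = ["Techstars", "TECHSTARS ", "Y Combinator", "y  combinator"]

    def normalize(name):
        # Lowercase and collapse whitespace so near-identical entries collide.
        return " ".join(name.lower().split())

    seen, unique = set(), []
    for name in accelerators:
        key = normalize(name)
        if key not in seen:
            seen.add(key)
            unique.append(name)
    print(unique)  # ['Techstars', 'Y Combinator']
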
11/15/16 14:00 - 18:00
- Took a break from f6s to locate more lists via individual Google searches such as "city+accelerator+list"; see the query sketch after this entry.
- Put Seed DB information into an Excel file on the remote desktop.
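
The searches themselves were run by hand in a browser; purely as an illustration, generating the query strings could look like this (the city list and query templates are placeholders):

    # Hypothetical cities and templates; expand as needed.
    cities = ["Houston", "Austin", "Boston"]
    for city in cities:
        for template in ("{0}+accelerator+list", "{0}+startup+accelerator"):
            print("https://www.google.com/search?q=" + template.format(city))
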
11/17/16 14:00 - 16:00
- Continued filling out information for the random Google searches.
- Organized TextPad files on the RDP into coherent Excel spreadsheets with proper table headers.
- Noticed a problem with f6s: it seems as though all of the HTML was protected by a CAPTCHA, so the crawler did not actually extract any information; it was all blocked.
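
A minimal sketch of a check that would have caught this block, assuming the crawler fetched pages with Python's requests library (an assumption; the actual crawler tooling and the f6s CAPTCHA markup are not recorded in this log):

    import requests

    resp = requests.get("https://www.f6s.com/", timeout=30)
    # A CAPTCHA wall typically shows up as a 403/429 status or as "captcha"
    # somewhere in the returned HTML instead of the real listings.
    if resp.status_code in (403, 429) or "captcha" in resp.text.lower():
        print("Blocked: the page is serving a CAPTCHA, not accelerator listings")
    else:
        print("Fetched", len(resp.text), "bytes of HTML")
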
11/22/16 14:00 - 17:00
- Worked with Peter to fix the f6s crawler
- Finished and compiled the master list of accelerators
12/01/16 14:00 - 18:00
- Caught up on the project with Ed and Carlin
- Took 20 accelerators (241-260) from the list and filled out text.html files for them; finished all 20
12/05/16 13:00 - 16:00
- After finishing the first 20 accelerators, continued working down the list, beginning at number 321
- Work noted in Accelerator Seed List (Data), but mostly stored on McNair RDP
12/06/16 14:00 - 18:00
- Continued "Accelerating" down the list in Accelerator Seed List (Data), finished up until 340
12/08/16 14:00 - 17:00
- Continued working on the accelerator list on the same page.
01/17/17 14:00 - 16:00
- Finished up "accelerating" from Accelerator Seed List (Data), numbers 341-351
1/18/17 14:00 - 16:00
- Finished accelerating for good; went back and began reviewing the completed work for quality control.
01/20/17 14:00 - 16:00
- Mandatory meeting, then worked through 2 of Ed's unfinished accelerators
1/23/17 14:00 - 16:00
- Worked with Matthew to go over about 70 items in the accelerator list and ensure that they follow a uniform structure and show correct information
1/24/17 14:00 - 16:00
- Worked with Peter to fix the problem of results not coming through on the new spreadsheet by renaming the file and including more symbols in the searches. The spreadsheet should be up to date now.
- Got to number 144 on the list while going through files.