{{Team Member
|Has team sponsor=McNair Center
|Went to school=Rice
|Has team position=Student
|Has job title=Tech Team
|Has name=Peter Jalbert,
|Has headshot=peter_headshot.jpg
|Has or doing degree=Bachelor
|Has academic major=Computer Science; Music Performance
|class=2019,
|join_date=09/27/2016,
|Has skills=Python, Selenium, Javascript, Java, SQL,
|interests=Music, Movies, Travel,
|Has email=pwj1@rice.edu
|Has team status=Active
}}
==Education==
Peter is currently a junior at Rice University, pursuing a double major in Computer Science and Music. He graduated as salutatorian from the High School for the Performing and Visual Arts in Houston, TX, in 2014.
==Contributing Projects==
*[http://mcnair.bakerinstitute.org/wiki/Selenium_Documentation Selenium Documentation]
*[http://mcnair.bakerinstitute.org/wiki/Demo_Day_Page_Parser Demo Day Crawler]
*[http://mcnair.bakerinstitute.org/wiki/Tiger_Geocoder#State_Data Tiger Geocoder]
*[http://mcnair.bakerinstitute.org/wiki/Houston_Innovation_District Houston Innovation District]
*[http://mcnair.bakerinstitute.org/wiki/Crunchbase_Data Crunchbase Data]
*[http://mcnair.bakerinstitute.org/wiki/LinkedIn_Crawler_(Python) LinkedIn Crawler]
*[http://mcnair.bakerinstitute.org/wiki/Urban_Start-up_Agglomeration Urban Start-up Agglomeration]
*[http://mcnair.bakerinstitute.org/wiki/Enclosing_Circle_Algorithm Enclosing Circle Algorithm]
*[http://mcnair.bakerinstitute.org/wiki/Top_Cities_for_VC_Backed_Companies Top Cities for VC Backed Companies]
*[http://mcnair.bakerinstitute.org/wiki/Moroccan_Parliament_Web_Crawler Moroccan Parliament Web Crawler]
*[http://mcnair.bakerinstitute.org/wiki/E%26I_Governance_Policy_Report E&I Governance Policy Report]
*[http://mcnair.bakerinstitute.org/wiki/Govtrack_Webcrawler_(Wiki_Page) GovTrack Web Crawler]
*[http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List]
*[http://mcnair.bakerinstitute.org/wiki/Google_Scholar_Crawler Google Scholar Crawler]
+ | |||
+ | ==Looking for Code?== | ||
+ | |||
+ | ===Demo Day Crawler=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Crawls Google search to find candidate web pages for accelerator companies' demo days. | ||
+ | E:\McNair\Software\Accelerators\DemoDayCrawler.py | ||
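For reference, a minimal sketch of this style of crawl, using Selenium to run a Google search and collect result links. The query template, XPath selector, and CSV output are illustrative assumptions rather than the actual behavior of DemoDayCrawler.py, and Google's result markup changes frequently.

 # Minimal sketch only: a Selenium-driven Google search for demo day pages.
 # Query template, selector, and output format are assumptions, not
 # necessarily what DemoDayCrawler.py does.
 import csv
 import time
 from urllib.parse import quote_plus
 from selenium import webdriver
 from selenium.webdriver.common.by import By
 
 def google_demo_day_candidates(accelerator, max_results=10):
     """Return (title, url) pairs from a Google search for an accelerator's demo day."""
     driver = webdriver.Chrome()
     try:
         driver.get("https://www.google.com/search?q=" + quote_plus('"%s" demo day' % accelerator))
         time.sleep(2)  # crude politeness/render delay
         results = []
         for anchor in driver.find_elements(By.XPATH, "//a[h3]"):  # organic results wrap an <h3> title
             title = anchor.find_element(By.TAG_NAME, "h3").text
             results.append((title, anchor.get_attribute("href")))
             if len(results) >= max_results:
                 break
         return results
     finally:
         driver.quit()
 
 if __name__ == "__main__":
     with open("demo_day_candidates.csv", "w", newline="") as f:
         csv.writer(f).writerows(google_demo_day_candidates("Techstars"))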
+ | |||
+ | ===Demo Day Hits=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Analyzes the results of a demo day crawl for hits of keywords. | ||
+ | E:\McNair\Software\Accelerators\DemoDayHits.py | ||
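A minimal sketch of this kind of keyword-hit count, assuming the crawl output is a folder of text or HTML files; the keyword list, folder name, and CSV output are placeholders rather than what DemoDayHits.py actually uses.

 # Minimal sketch only: counting keyword hits in crawled demo day pages.
 # The keyword list, input folder, and output file are placeholders.
 import csv
 import os
 
 KEYWORDS = ["demo day", "cohort", "accelerator", "pitch"]  # illustrative keywords
 
 def keyword_hits(folder):
     """Count case-insensitive keyword occurrences in each file of a folder."""
     rows = []
     for name in sorted(os.listdir(folder)):
         with open(os.path.join(folder, name), encoding="utf-8", errors="ignore") as f:
             text = f.read().lower()
         rows.append([name] + [text.count(kw) for kw in KEYWORDS])
     return rows
 
 if __name__ == "__main__":
     with open("demo_day_hits.csv", "w", newline="") as out:
         writer = csv.writer(out)
         writer.writerow(["file"] + KEYWORDS)
         writer.writerows(keyword_hits("demo_day_pages"))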
+ | |||
+ | ===HTML to Text=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Converts a folder of HTML files to a folder of TXT files. | ||
+ | E:\McNair\Software\Accelerators\htmlToText.py | ||
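A minimal sketch of the conversion using BeautifulSoup's get_text(); the folder names are placeholders and htmlToText.py may differ in details.

 # Minimal sketch only: stripping HTML down to plain text with BeautifulSoup.
 # Folder names are placeholders.
 import os
 from bs4 import BeautifulSoup
 
 def html_folder_to_text(src_dir, dst_dir):
     """Write a .txt file for every .html file in src_dir."""
     os.makedirs(dst_dir, exist_ok=True)
     for name in os.listdir(src_dir):
         if not name.lower().endswith((".html", ".htm")):
             continue
         with open(os.path.join(src_dir, name), encoding="utf-8", errors="ignore") as f:
             soup = BeautifulSoup(f.read(), "html.parser")
         text = soup.get_text(separator="\n", strip=True)
         out_name = os.path.splitext(name)[0] + ".txt"
         with open(os.path.join(dst_dir, out_name), "w", encoding="utf-8") as out:
             out.write(text)
 
 if __name__ == "__main__":
     html_folder_to_text("html_pages", "text_pages")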
+ | |||
+ | ===Tiger Geocoder=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Installed a psql extension that allows for internal geocoding of addresses. | ||
+ | psql geocoder | ||
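As a reminder of how the extension is called, here is a hedged sketch that queries the geocode() function from Python with psycopg2; the connection details and the geocoder database name are assumptions based on the psql geocoder line above.

 # Minimal sketch only: calling the PostGIS TIGER geocoder's geocode() function
 # from Python. The 'geocoder' database name and connection defaults are assumptions.
 import psycopg2
 
 def geocode_address(address):
     """Return (rating, longitude, latitude) candidates for a street address."""
     conn = psycopg2.connect(dbname="geocoder")
     try:
         with conn.cursor() as cur:
             cur.execute(
                 "SELECT g.rating, ST_X(g.geomout), ST_Y(g.geomout) "
                 "FROM geocode(%s) AS g;",
                 (address,),
             )
             return cur.fetchall()
     finally:
         conn.close()
 
 if __name__ == "__main__":
     for rating, lon, lat in geocode_address("6100 Main St, Houston, TX 77005"):
         print(rating, lon, lat)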
+ | |||
+ | ===Yelp Crawler=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Usage: Crawls for data on restaurants and coffeeshops within the 610 Loop. Part of the Houston Innovation District Project. | ||
+ | E:\McNair\Software\YelpCrawler\yelp_crawl.py | ||
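A hedged sketch of one way to pull this kind of data, using the Yelp Fusion API search endpoint rather than page scraping (yelp_crawl.py may well scrape instead); the API key, coordinates, and radius are placeholders.

 # Minimal sketch only: pulling restaurant/coffee listings near central Houston
 # via the Yelp Fusion API. API key, coordinates, and radius are placeholders.
 import requests
 
 API_KEY = "YOUR_YELP_API_KEY"  # placeholder
 SEARCH_URL = "https://api.yelp.com/v3/businesses/search"
 
 def search_businesses(categories, latitude=29.7604, longitude=-95.3698, radius=10000):
     """Return business names and coordinates within a radius (metres) of a point."""
     resp = requests.get(
         SEARCH_URL,
         headers={"Authorization": "Bearer " + API_KEY},
         params={
             "categories": categories,
             "latitude": latitude,
             "longitude": longitude,
             "radius": radius,
             "limit": 50,
         },
     )
     resp.raise_for_status()
     return [
         (b["name"], b["coordinates"]["latitude"], b["coordinates"]["longitude"])
         for b in resp.json().get("businesses", [])
     ]
 
 if __name__ == "__main__":
     for name, lat, lon in search_businesses("restaurants,coffee"):
         print(name, lat, lon)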
+ | |||
+ | ===Accelerator Founders=== | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Usage: Uses the LinkedIn Crawler along with the Crunchbase founders data to retrieve information on accelerator founders. | ||
+ | E:\McNair\Projects\LinkedIn Crawler\LinkedIn_Crawler\linkedin\linkedin_founders.py | ||
+ | |||
+ | === Crunchbase Founders === | ||
+ | Term: Fall 2017 | ||
+ | |||
+ | Usage: Queries the Crunchbase API to get names of accelerator founders. | ||
+ | E:\McNair\Projects\Accelerators\crunchbase_founders.py | ||
+ | |||
+ | ===LinkedIn Crawler=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Crawls LinkedIn to obtain relevant information. | ||
+ | E:\McNair\Projects\LinkedIn Crawler\web_crawler\linkedin\run_linkedin_recruiter.py | ||
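A heavily hedged sketch of the general pattern (log in with Selenium, then save pages); the element IDs and selectors are assumptions that break whenever LinkedIn changes its markup, and run_linkedin_recruiter.py is more involved than this.

 # Minimal sketch only: a Selenium session that logs in to LinkedIn and saves a
 # profile page. Element IDs, URLs, and credential handling are assumptions.
 import time
 from selenium import webdriver
 from selenium.webdriver.common.by import By
 
 def fetch_profile(email, password, profile_url, out_path="profile.html"):
     """Log in, open a profile URL, and dump its HTML to disk."""
     driver = webdriver.Chrome()
     try:
         driver.get("https://www.linkedin.com/login")
         driver.find_element(By.ID, "username").send_keys(email)
         driver.find_element(By.ID, "password").send_keys(password)
         driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
         time.sleep(3)  # crude wait; a real crawler should use explicit waits
         driver.get(profile_url)
         time.sleep(3)
         with open(out_path, "w", encoding="utf-8") as f:
             f.write(driver.page_source)
     finally:
         driver.quit()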
+ | |||
+ | ===Draw Enclosing Circles=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Draws the outcome of the Enclosing Circle Algorithm on a particular city to a google map HTML output. | ||
+ | E:\McNair\Projects\Accelerators\Enclosing_Circle\draw_vc_circles.py | ||
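A minimal sketch of writing circles to an interactive HTML map. The original script targets a Google Maps output; folium (Leaflet) is used here as a stand-in, and the circle centers and radii are placeholders.

 # Minimal sketch only: rendering circles to an interactive HTML map with folium,
 # as a stand-in for the Google Maps output. Centers and radii are placeholders.
 import folium
 
 def draw_circles(circles, center=(29.7604, -95.3698), out_path="circles.html"):
     """circles: iterable of (lat, lon, radius_in_metres) tuples."""
     fmap = folium.Map(location=list(center), zoom_start=11)
     for lat, lon, radius in circles:
         folium.Circle(
             location=[lat, lon],
             radius=radius,
             color="red",
             fill=True,
             fill_opacity=0.2,
         ).add_to(fmap)
     fmap.save(out_path)
 
 if __name__ == "__main__":
     draw_circles([(29.75, -95.36, 1500), (29.72, -95.40, 900)])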
+ | |||
+ | ===Enclosing Circle for VCs === | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Uses the Enclosing Circle algorithm to find concentrations of VCs. | ||
+ | E:\McNair\Projects\Accelerators\Enclosing_Circle\vc_circles.py | ||
+ | |||
+ | === Industry Classifier === | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Neural Net that predicts a companies industry classification. | ||
+ | E:\McNair\Projects\Accelerators\Code+Final_Data\ChristyCode\IndustryClassifier.py | ||
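A minimal sketch of a text-based industry classifier using a TF-IDF bag of words and scikit-learn's MLPClassifier; the actual IndustryClassifier.py may use different features, labels, and model, and the training examples below are placeholders.

 # Minimal sketch only: TF-IDF features plus a small scikit-learn neural network.
 # Training examples and labels are placeholders.
 from sklearn.feature_extraction.text import TfidfVectorizer
 from sklearn.neural_network import MLPClassifier
 from sklearn.pipeline import make_pipeline
 
 # Placeholder training data: company description -> industry label.
 descriptions = [
     "mobile payments and peer to peer transfers",
     "cloud infrastructure and devops tooling",
     "biotech drug discovery platform",
     "online food delivery marketplace",
 ]
 labels = ["fintech", "enterprise software", "biotech", "consumer"]
 
 model = make_pipeline(
     TfidfVectorizer(ngram_range=(1, 2)),
     MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
 )
 model.fit(descriptions, labels)
 
 print(model.predict(["genomics based therapeutics startup"]))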
+ | |||
+ | ===WayBack Machine Parser=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Uses the WayBack Machine API to retrieve timestamps for URLs. | ||
+ | E:\McNair\Projects\Accelerators\Spring 2017\Code+Final_Data\wayback_machine.py | ||
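A minimal sketch using the Wayback Machine availability API to get the closest snapshot timestamp for each URL; wayback_machine.py may use a different endpoint or post-processing, and the URL list is a placeholder.

 # Minimal sketch only: querying the Wayback Machine "availability" API for the
 # closest archived snapshot of each URL. The URL list is a placeholder.
 import requests
 
 AVAILABILITY_API = "http://archive.org/wayback/available"
 
 def closest_snapshot_timestamp(url):
     """Return the timestamp (YYYYMMDDhhmmss) of the closest archived snapshot, or None."""
     resp = requests.get(AVAILABILITY_API, params={"url": url})
     resp.raise_for_status()
     snapshot = resp.json().get("archived_snapshots", {}).get("closest")
     return snapshot["timestamp"] if snapshot else None
 
 if __name__ == "__main__":
     for url in ["http://www.techstars.com", "http://www.ycombinator.com"]:
         print(url, closest_snapshot_timestamp(url))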
+ | |||
+ | ===Accelerator Address Geolocation=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Used to find latitude and longitude points for all accelerators in the accelerator data files. | ||
+ | E:\McNair\Projects\Accelerators\Code+Final_Data\process_locations.py | ||
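A minimal sketch of geocoding a column of addresses; the original script may use a different geocoding service, so geopy's Nominatim geocoder is used here, and the file and column names are placeholders.

 # Minimal sketch only: geocoding a CSV of accelerator addresses with geopy.
 # File names and the "name"/"address" column schema are placeholders.
 import csv
 import time
 from geopy.geocoders import Nominatim
 
 def geocode_file(in_path="accelerators.csv", out_path="accelerators_geocoded.csv"):
     geolocator = Nominatim(user_agent="mcnair-accelerator-geocoder")
     with open(in_path, newline="") as f_in, open(out_path, "w", newline="") as f_out:
         reader = csv.DictReader(f_in)
         writer = csv.writer(f_out)
         writer.writerow(["name", "address", "latitude", "longitude"])
         for row in reader:
             location = geolocator.geocode(row["address"])
             lat = location.latitude if location else ""
             lon = location.longitude if location else ""
             writer.writerow([row["name"], row["address"], lat, lon])
             time.sleep(1)  # Nominatim's usage policy asks for at most 1 request/second
 
 if __name__ == "__main__":
     geocode_file()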
+ | |||
+ | ===Accelerator Data Parser=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Used to parse the data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List Accelerator Seed List Project]. | ||
+ | E:\McNair\Projects\Accelerators\Code+Final_Data\parse_accelerator_data.py | ||
+ | |||
+ | ===Cohort Data Parser=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Used to parse cohort data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List Accelerator Seed List Project]. | ||
+ | E:\McNair\Projects\Accelerators\Code+Final_Data\parse_cohort_data.py | ||
+ | |||
+ | ===Google SiteSearch=== | ||
+ | Term: Spring 2017 | ||
+ | |||
+ | Usage: Preliminary stage project intended to find an accurate web site for an unlisted company web address by using Google Search. | ||
+ | |||
+ | E:\McNair\Projects\Accelerators\Google_SiteSearch\sitesearch.py | ||
+ | |||
+ | ===F6S Crawler=== | ||
+ | Term: Fall 2016 | ||
+ | |||
+ | Usage: Used to download html files containing accelerator information from the F6S website. | ||
+ | |||
+ | E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_crawler_gentle.py | ||
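A minimal sketch of a "gentle" downloader that spaces out requests and saves the raw HTML; the URL list, delay, and output folder are placeholders, and f6s_crawler_gentle.py may differ.

 # Minimal sketch only: rate-limited downloading of F6S pages to disk.
 # URL list, delay, and output folder are placeholders.
 import os
 import time
 import requests
 
 def download_pages(urls, out_dir="f6s_html", delay_seconds=5):
     os.makedirs(out_dir, exist_ok=True)
     for i, url in enumerate(urls):
         resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
         out_path = os.path.join(out_dir, "page_{:04d}.html".format(i))
         with open(out_path, "w", encoding="utf-8") as f:
             f.write(resp.text)
         time.sleep(delay_seconds)  # be gentle: space out requests
 
 if __name__ == "__main__":
     download_pages(["https://www.f6s.com/some-accelerator"])  # placeholder URL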
+ | |||
+ | ===F6S Parser=== | ||
+ | Term: Fall 2016 | ||
+ | |||
+ | Usage: Used to parse the html files downloaded by the F6S crawler to create a list of accelerators. | ||
+ | |||
+ | E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_parser.py | ||
+ | |||
+ | ===Executive Order Crawler=== | ||
+ | Term: Fall 2016 | ||
+ | |||
+ | Usage: Used to download executive orders. NOTE: uses scrapy format, run differently from regular python programs. | ||
+ | |||
+ | E:\McNair\Projects\Executive_order_crawler\executive | ||
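Since Scrapy spiders are launched through the scrapy command rather than run as plain scripts, here is a generic sketch of the layout and run command; the start URL and selectors are placeholders, not the project's actual spider.

 # Minimal sketch only: the general shape of a Scrapy spider and how it is run.
 # Start URL and selectors are placeholders.
 import scrapy
 
 class ExecutiveOrderSpider(scrapy.Spider):
     name = "executive_orders"
     start_urls = ["https://www.federalregister.gov/presidential-documents/executive-orders"]  # placeholder
 
     def parse(self, response):
         # Yield one item per document link found on the listing page.
         for href in response.css("a::attr(href)").getall():
             if "executive-order" in href:
                 yield {"url": response.urljoin(href)}
 
 # Unlike a plain script, a Scrapy spider is launched through the scrapy CLI, e.g.:
 #   scrapy runspider executive_order_spider.py -o executive_orders.json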
+ | |||
+ | |||
+ | ===Kuwait Web Driver=== | ||
+ | Term: Fall 2016 | ||
+ | |||
+ | Usage: Used to download csvs of bills and questions from the Kuwait Government Website. Uses Selenium. All scripts in the folder do similar things. | ||
+ | |||
+ | E:\McNair\Projects\Middle East Studies Web Drivers\Kuwait | ||
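A hedged sketch of the general pattern: a Chrome Selenium session configured to save downloads to a fixed folder, then clicking download links on a page. The page URL and link selector are placeholders, and the Kuwait scripts are site-specific.

 # Minimal sketch only: a Selenium Chrome session that saves downloads to a
 # fixed folder and clicks download links. Page URL and selector are placeholders.
 import time
 from selenium import webdriver
 from selenium.webdriver.common.by import By
 
 def download_csvs(page_url, download_dir):
     options = webdriver.ChromeOptions()
     options.add_experimental_option("prefs", {
         "download.default_directory": download_dir,
         "download.prompt_for_download": False,
     })
     driver = webdriver.Chrome(options=options)
     try:
         driver.get(page_url)
         for link in driver.find_elements(By.PARTIAL_LINK_TEXT, "CSV"):  # placeholder selector
             link.click()
             time.sleep(2)  # give each download time to start
     finally:
         driver.quit()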
+ | |||
+ | ===Moroccan Web Driver=== | ||
+ | Term: Fall 2016 | ||
+ | |||
+ | Usage: Used to download pdfs of bills and questions from the Moroccan Government Website. Uses Selenium. All scripts in the folder do similar things. | ||
+ | |||
+ | E:\McNair\Projects\Middle East Studies Web Drivers\Morocco\Moroccan Bills | ||
+ | |||
+ | |||
+ | ==Time at McNair== | ||
[[Peter Jalbert (Work Log)]] | [[Peter Jalbert (Work Log)]] | ||
− | + | [[Category:McNair Staff]] | |
− |