{{Team Member
|Has team sponsor=McNair Center
|Went to school=Rice
|Has team position=Student
|Has job title=Tech Team
|Has name=Peter Jalbert
|Has headshot=peter_headshot.jpg
|Has or doing degree=Bachelor
|Has academic major=Computer Science; Music Performance
|class=2019
|join_date=09/27/2016
|Has skills=Python, Selenium, Javascript, Java, SQL
|interests=Music, Movies, Travel
|Has email=pwj1@rice.edu
|Has team status=Active
}}
 
==Early Life==
 
Peter was born in Philadelphia, PA to Julia and Pierre Jalbert in the middle of his father's interview for a position at Rice University. Days later, the Jalbert family moved to Houston, TX, where they still reside to this day.
 
 
 
==Education==
Peter is currently a junior at Rice University, pursuing a double major in Computer Science and Music. Peter graduated Salutatorian from the High School for the Performing and Visual Arts in Houston, TX in 2014.
  
 
==Contributing Projects==
*[http://mcnair.bakerinstitute.org/wiki/Selenium_Documentation Selenium Documentation]
*[http://mcnair.bakerinstitute.org/wiki/Demo_Day_Page_Parser Demo Day Crawler]
*[http://mcnair.bakerinstitute.org/wiki/Tiger_Geocoder#State_Data Tiger Geocoder]
*[http://mcnair.bakerinstitute.org/wiki/Houston_Innovation_District Houston Innovation District]
*[http://mcnair.bakerinstitute.org/wiki/Crunchbase_Data Crunchbase Data]
*[http://mcnair.bakerinstitute.org/wiki/LinkedIn_Crawler_(Python) LinkedIn Crawler]
*[http://mcnair.bakerinstitute.org/wiki/Urban_Start-up_Agglomeration Urban Start-up Agglomeration]
*[http://mcnair.bakerinstitute.org/wiki/Enclosing_Circle_Algorithm Enclosing Circle Algorithm]
*[http://mcnair.bakerinstitute.org/wiki/Top_Cities_for_VC_Backed_Companies Top Cities for VC Backed Companies]
*[http://mcnair.bakerinstitute.org/wiki/Moroccan_Parliament_Web_Crawler Moroccan Parliament Web Crawler]
*[http://mcnair.bakerinstitute.org/wiki/E%26I_Governance_Policy_Report E&I Governance Policy Report]
  
 
==Looking for Code?==
 
===Demo Day Crawler===
Term: Fall 2017

Crawls Google search to find candidate web pages for accelerator companies' demo days.

E:\McNair\Software\Accelerators\DemoDayCrawler.py
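
For reference, a minimal sketch of this kind of crawl is below: it queries Google for "&lt;accelerator&gt; demo day" and collects the outbound result links. The query template, the result-link extraction, and the output file are assumptions rather than the contents of DemoDayCrawler.py, and Google may rate-limit requests or change its markup at any time.
<pre>
"""Hedged sketch of a demo-day page crawler (not the original DemoDayCrawler.py)."""
import csv
import time
import urllib.parse

import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0 (research crawler)"}

def candidate_pages(accelerator, max_links=10):
    """Return up to max_links result URLs from a Google search for the demo day."""
    query = f"{accelerator} demo day"
    url = "https://www.google.com/search?" + urllib.parse.urlencode({"q": query})
    soup = BeautifulSoup(requests.get(url, headers=HEADERS, timeout=30).text, "html.parser")
    links = []
    for a in soup.find_all("a", href=True):
        href = a["href"]
        # Result links are usually wrapped as /url?q=<target>&...; this is an
        # assumption about Google's markup and can break without notice.
        if href.startswith("/url?q="):
            target = urllib.parse.parse_qs(urllib.parse.urlparse(href).query).get("q", [""])[0]
            if target.startswith("http") and "google." not in target:
                links.append(target)
        if len(links) >= max_links:
            break
    return links

if __name__ == "__main__":
    accelerators = ["Y Combinator", "Techstars"]  # in practice, read from the accelerator data file
    with open("demo_day_candidates.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["accelerator", "url"])
        for name in accelerators:
            for link in candidate_pages(name):
                writer.writerow([name, link])
            time.sleep(5)  # be polite between queries
</pre>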

===Demo Day Hits===
Term: Fall 2017

Analyzes the results of a demo day crawl for keyword hits.

E:\McNair\Software\Accelerators\DemoDayHits.py
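
A minimal sketch of the keyword-hit count is below, assuming the crawled pages were saved as HTML files in a folder; the keyword list, folder name, and output format are assumptions rather than what DemoDayHits.py uses.
<pre>
"""Hedged sketch of a keyword-hit counter for crawled demo day pages."""
import csv
import os

from bs4 import BeautifulSoup

KEYWORDS = ["demo day", "cohort", "batch", "pitch"]  # illustrative keyword list

def count_hits(html_path):
    """Count case-insensitive occurrences of each keyword in one HTML file."""
    with open(html_path, encoding="utf-8", errors="ignore") as f:
        text = BeautifulSoup(f.read(), "html.parser").get_text(" ").lower()
    return {kw: text.count(kw) for kw in KEYWORDS}

if __name__ == "__main__":
    folder = "demo_day_pages"          # assumed folder of saved pages
    with open("demo_day_hits.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["file"] + KEYWORDS)
        for name in sorted(os.listdir(folder)):
            if name.lower().endswith((".html", ".htm")):
                hits = count_hits(os.path.join(folder, name))
                writer.writerow([name] + [hits[kw] for kw in KEYWORDS])
</pre>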

===HTML to Text===
Term: Fall 2017

Converts a folder of HTML files to a folder of TXT files.

E:\McNair\Software\Accelerators\htmlToText.py
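
The conversion is small enough to sketch in full: strip tags with BeautifulSoup and write one .txt per .html. The folder names below are assumptions.
<pre>
"""Hedged sketch of an HTML-to-text converter (folder names are assumptions)."""
import os

from bs4 import BeautifulSoup

def html_folder_to_text(src_dir, dst_dir):
    """Write a .txt file in dst_dir for every .html/.htm file in src_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        if not name.lower().endswith((".html", ".htm")):
            continue
        with open(os.path.join(src_dir, name), encoding="utf-8", errors="ignore") as f:
            text = BeautifulSoup(f.read(), "html.parser").get_text("\n")
        out_name = os.path.splitext(name)[0] + ".txt"
        with open(os.path.join(dst_dir, out_name), "w", encoding="utf-8") as out:
            out.write(text)

if __name__ == "__main__":
    html_folder_to_text("html_in", "text_out")  # assumed folder names
</pre>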

===Tiger Geocoder===
Term: Fall 2017

Installed the PostGIS TIGER geocoder PostgreSQL extension, which allows addresses to be geocoded inside the database.

psql geocoder
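
Once the extension and state data are loaded, addresses can be geocoded with the geocode() SQL function. A sketch of calling it from Python with psycopg2 is below; the database name, credentials, and single-result query are assumptions.
<pre>
"""Hedged sketch: geocode an address with the PostGIS TIGER geocoder from Python.

Assumes the TIGER geocoder extension and state data are already loaded into a
database called 'geocoder'; connection details are placeholders.
"""
import psycopg2

QUERY = """
    SELECT g.rating,
           ST_X(g.geomout) AS lon,
           ST_Y(g.geomout) AS lat
    FROM geocode(%s, 1) AS g;
"""

def geocode_address(address):
    """Return (rating, lon, lat) for the best TIGER geocoder match, or None."""
    conn = psycopg2.connect(dbname="geocoder", user="postgres")  # placeholder credentials
    try:
        with conn.cursor() as cur:
            cur.execute(QUERY, (address,))
            return cur.fetchone()
    finally:
        conn.close()

if __name__ == "__main__":
    print(geocode_address("6100 Main St, Houston, TX 77005"))
</pre>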

===Yelp Crawler===
Term: Fall 2017

Usage: Crawls for data on restaurants and coffee shops within the 610 Loop. Part of the Houston Innovation District Project.

E:\McNair\Software\YelpCrawler\yelp_crawl.py
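
A hedged sketch of one way to collect the same kind of data is below, using the Yelp Fusion search API. The API key, the center point and radius used to approximate the 610 Loop, and the choice of the API over scraping yelp.com are all assumptions about how yelp_crawl.py works.
<pre>
"""Hedged sketch of pulling restaurant/coffee-shop data near central Houston
via the Yelp Fusion API (parameters are placeholders)."""
import csv
import os

import requests

API_KEY = os.environ.get("YELP_API_KEY", "")         # placeholder; set your own key
SEARCH_URL = "https://api.yelp.com/v3/businesses/search"

def search(category, latitude, longitude, radius_m=8000, limit=50):
    """Return one page of Yelp business results for a category around a point."""
    resp = requests.get(
        SEARCH_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"categories": category, "latitude": latitude,
                "longitude": longitude, "radius": radius_m, "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("businesses", [])

if __name__ == "__main__":
    # Approximate center of the 610 Loop; the radius is a rough proxy for the loop.
    lat, lon = 29.76, -95.37
    with open("yelp_610_loop.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["name", "category", "rating", "address", "lat", "lon"])
        for cat in ("restaurants", "coffee"):
            for biz in search(cat, lat, lon):
                writer.writerow([
                    biz.get("name"), cat, biz.get("rating"),
                    biz.get("location", {}).get("address1"),
                    biz.get("coordinates", {}).get("latitude"),
                    biz.get("coordinates", {}).get("longitude"),
                ])
</pre>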

===Accelerator Founders===
Term: Fall 2017

Usage: Uses the LinkedIn Crawler along with the Crunchbase founders data to retrieve information on accelerator founders.

E:\McNair\Projects\LinkedIn Crawler\LinkedIn_Crawler\linkedin\linkedin_founders.py

===Crunchbase Founders===
Term: Fall 2017

Usage: Queries the Crunchbase API to get names of accelerator founders.

E:\McNair\Projects\Accelerators\crunchbase_founders.py

===LinkedIn Crawler===
Term: Spring 2017

Usage: Crawls LinkedIn to obtain relevant profile information.

E:\McNair\Projects\LinkedIn Crawler\web_crawler\linkedin\run_linkedin_recruiter.py
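
The sketch below shows only the general Selenium login-and-save pattern this kind of crawler relies on. The element IDs, URLs, and timing are assumptions that break whenever LinkedIn changes its pages, and automated crawling may violate LinkedIn's terms of service.
<pre>
"""Hedged sketch of a Selenium-based LinkedIn session (not the original crawler)."""
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

def login(driver, email, password):
    """Log in through the standard login form (field IDs are assumptions)."""
    driver.get("https://www.linkedin.com/login")
    driver.find_element(By.ID, "username").send_keys(email)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
    time.sleep(5)  # crude wait for the redirect; a WebDriverWait would be better

def save_page(driver, url, out_path):
    """Fetch a page in the logged-in session and save its HTML for later parsing."""
    driver.get(url)
    time.sleep(3)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(driver.page_source)

if __name__ == "__main__":
    drv = webdriver.Chrome()
    try:
        login(drv, "user@example.com", "password")              # placeholder credentials
        save_page(drv, "https://www.linkedin.com/in/some-founder/", "founder.html")
    finally:
        drv.quit()
</pre>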

===Draw Enclosing Circles===
Term: Spring 2017

Usage: Draws the outcome of the Enclosing Circle Algorithm for a particular city onto a Google Maps HTML output.

E:\McNair\Projects\Accelerators\Enclosing_Circle\draw_vc_circles.py
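
The original script writes a Google Maps HTML page. As a stand-in, the sketch below draws circles with folium (Leaflet) instead, from (latitude, longitude, radius in meters) tuples like those an enclosing-circle run might produce; the input values and file names are made up.
<pre>
"""Hedged sketch: draw enclosing circles on an interactive map with folium."""
import folium

# Example output of an enclosing-circle run for one city (illustrative values).
CIRCLES = [
    (29.760, -95.370, 2500),   # downtown Houston
    (29.735, -95.390, 1800),
    (29.717, -95.402, 1200),   # near Rice University
]

def draw_circles(circles, out_html="vc_circles.html"):
    """Write an HTML map with one circle per (lat, lon, radius_m) tuple."""
    center_lat = sum(c[0] for c in circles) / len(circles)
    center_lon = sum(c[1] for c in circles) / len(circles)
    fmap = folium.Map(location=[center_lat, center_lon], zoom_start=12)
    for lat, lon, radius_m in circles:
        folium.Circle(location=[lat, lon], radius=radius_m,
                      color="red", fill=True, fill_opacity=0.2).add_to(fmap)
    fmap.save(out_html)

if __name__ == "__main__":
    draw_circles(CIRCLES)
</pre>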

===Enclosing Circle for VCs===
Term: Spring 2017

Usage: Uses the Enclosing Circle Algorithm to find concentrations of VCs.

E:\McNair\Projects\Accelerators\Enclosing_Circle\vc_circles.py
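
The algorithm itself is documented on the Enclosing Circle Algorithm page. As a greatly simplified stand-in, the sketch below clusters VC coordinates with k-means and wraps each cluster in a circle that reaches its farthest member; it illustrates turning point data into (center, radius) concentrations, not the project's actual algorithm, and the input points are made up.
<pre>
"""Hedged sketch: approximate VC 'concentrations' by clustering coordinates."""
import numpy as np
from sklearn.cluster import KMeans

def cluster_circles(points, k=3):
    """Return a list of (center_lat, center_lon, radius_km) for k clusters."""
    pts = np.asarray(points, dtype=float)          # rows of (lat, lon)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pts)
    circles = []
    for i, (clat, clon) in enumerate(km.cluster_centers_):
        members = pts[km.labels_ == i]
        # Crude equirectangular distance in km (fine at city scale).
        dlat_km = (members[:, 0] - clat) * 111.0
        dlon_km = (members[:, 1] - clon) * 111.0 * np.cos(np.radians(clat))
        radius_km = float(np.sqrt(dlat_km ** 2 + dlon_km ** 2).max())
        circles.append((float(clat), float(clon), radius_km))
    return circles

if __name__ == "__main__":
    # Illustrative VC office coordinates around Houston.
    vc_points = [(29.76, -95.37), (29.75, -95.36), (29.74, -95.42),
                 (29.72, -95.40), (29.80, -95.41), (29.81, -95.42)]
    for circle in cluster_circles(vc_points, k=2):
        print(circle)
</pre>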

===Industry Classifier===
Term: Spring 2017

Usage: Neural net that predicts a company's industry classification.

E:\McNair\Projects\Accelerators\Code+Final_Data\ChristyCode\IndustryClassifier.py
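
The page only records that this is a neural net. A minimal sketch of that kind of model, assuming the inputs are company descriptions labeled with industries, is a TF-IDF plus multilayer-perceptron pipeline in scikit-learn; the training data and network shape below are placeholders, not the original model.
<pre>
"""Hedged sketch of a text-based industry classifier (placeholder data and model)."""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Tiny made-up training set: (description, industry label).
TRAIN = [
    ("mobile payments platform for small businesses", "fintech"),
    ("online lending and credit scoring", "fintech"),
    ("genomic sequencing for cancer diagnostics", "biotech"),
    ("clinical trial management software for hospitals", "biotech"),
    ("marketplace for short-term apartment rentals", "consumer"),
    ("food delivery app with local restaurants", "consumer"),
]

def train_classifier(rows):
    """Fit a TF-IDF + multilayer-perceptron pipeline on (text, label) rows."""
    texts, labels = zip(*rows)
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    )
    model.fit(texts, labels)
    return model

if __name__ == "__main__":
    clf = train_classifier(TRAIN)
    print(clf.predict(["peer-to-peer payments for freelancers"]))
</pre>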

===WayBack Machine Parser===
Term: Spring 2017

Usage: Uses the WayBack Machine API to retrieve timestamps for URLs.

E:\McNair\Projects\Accelerators\Spring 2017\Code+Final_Data\wayback_machine.py
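
The Wayback Machine exposes an availability endpoint that returns the closest archived snapshot for a URL. A sketch using it with requests is below; the input URLs and what is done with the timestamps are assumptions, and the original wayback_machine.py may use a different endpoint.
<pre>
"""Hedged sketch: look up archived snapshot timestamps via the Wayback Machine
availability API (input URLs and output handling are placeholders)."""
import requests

API = "https://archive.org/wayback/available"

def closest_snapshot(url, timestamp=None):
    """Return (snapshot_url, timestamp) of the closest archived copy, or None."""
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp       # format YYYYMMDD narrows the search
    data = requests.get(API, params=params, timeout=30).json()
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"], closest["timestamp"]
    return None

if __name__ == "__main__":
    for site in ["http://www.techstars.com", "http://www.ycombinator.com"]:
        print(site, closest_snapshot(site, "20170101"))
</pre>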

===Accelerator Address Geolocation===
Term: Spring 2017

Usage: Used to find latitude and longitude points for all accelerators in the accelerator data files.

E:\McNair\Projects\Accelerators\Code+Final_Data\process_locations.py
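
A sketch of the same task using geopy's Nominatim geocoder is below; the original process_locations.py may use a different geocoding service, and the CSV layout (an "address" column) is an assumption. Nominatim's usage policy allows roughly one request per second.
<pre>
"""Hedged sketch: geocode accelerator addresses to latitude/longitude with geopy."""
import csv
import time

from geopy.geocoders import Nominatim

def geocode_file(in_csv, out_csv):
    """Add lat/lon columns to a CSV that has an 'address' column."""
    geolocator = Nominatim(user_agent="mcnair-accelerator-geocoder")
    with open(in_csv, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        location = geolocator.geocode(row["address"])
        row["lat"] = location.latitude if location else ""
        row["lon"] = location.longitude if location else ""
        time.sleep(1)  # stay within Nominatim's one-request-per-second policy
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    geocode_file("accelerators.csv", "accelerators_geocoded.csv")  # assumed file names
</pre>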
  
 
===Accelerator Data Parser===
Term: Spring 2017

Usage: Used to parse the data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List (Data)] project.

E:\McNair\Projects\Accelerators\Code+Final_Data\parse_accelerator_data.py

===Cohort Data Parser===
Term: Spring 2017

Usage: Used to parse cohort data for the [http://mcnair.bakerinstitute.org/wiki/Accelerator_Seed_List_(Data) Accelerator Seed List (Data)] project.

E:\McNair\Projects\Accelerators\Code+Final_Data\parse_cohort_data.py

===Google SiteSearch===
Term: Spring 2017

Usage: Preliminary-stage project intended to find an accurate website, via Google Search, for a company with no listed web address.

E:\McNair\Projects\Accelerators\Google_SiteSearch\sitesearch.py

===F6S Crawler===
Term: Fall 2016

Usage: Used to download HTML files containing accelerator information from the F6S website.

E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_crawler_gentle.py
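
The "_gentle" suffix suggests a rate-limited crawl. A sketch of that pattern is below: fetch each page with requests, save the raw HTML, and sleep between requests. The URL list, delay, and output folder are assumptions, and F6S blocks aggressive scraping.
<pre>
"""Hedged sketch of a 'gentle' (rate-limited) HTML downloader for F6S pages."""
import os
import time

import requests

HEADERS = {"User-Agent": "Mozilla/5.0 (research crawler)"}

def crawl(urls, out_dir="f6s_html", delay_s=10):
    """Download each URL's HTML to out_dir, pausing between requests."""
    os.makedirs(out_dir, exist_ok=True)
    for url in urls:
        name = url.rstrip("/").split("/")[-1] or "index"
        try:
            resp = requests.get(url, headers=HEADERS, timeout=30)
            resp.raise_for_status()
        except requests.RequestException as exc:
            print(f"skipping {url}: {exc}")
            continue
        with open(os.path.join(out_dir, name + ".html"), "w", encoding="utf-8") as f:
            f.write(resp.text)
        time.sleep(delay_s)  # the 'gentle' part: spread requests out

if __name__ == "__main__":
    # Placeholder program page; in practice the URLs come from a seed list.
    crawl(["https://www.f6s.com/some-accelerator-program"])
</pre>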

===F6S Parser===
Term: Fall 2016

Usage: Used to parse the HTML files downloaded by the F6S crawler to create a list of accelerators.

E:\McNair\Projects\Accelerators\F6S Accelerator HTMLs\F6S_Crawler\f6s_parser.py

===Executive Order Crawler===
Term: Fall 2016

Usage: Used to download executive orders. NOTE: written as a Scrapy project, so it is run with Scrapy's command-line tools rather than as a regular Python script.

E:\McNair\Projects\Executive_order_crawler\executive
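
Because this is a Scrapy project, it is launched with Scrapy's own commands (for example, scrapy crawl &lt;spider&gt; from the project directory, or scrapy runspider for a single spider file). A minimal spider of that shape is sketched below; the start URL and selector are placeholders, not the original crawler's targets.
<pre>
"""Hedged sketch of a Scrapy spider for executive orders (placeholder URL/selectors).

Run with Scrapy's CLI rather than as a plain script, e.g.:
    scrapy runspider executive_orders_spider.py -o orders.json
"""
import scrapy

class ExecutiveOrdersSpider(scrapy.Spider):
    name = "executive_orders"
    # Placeholder listing page; the original project targets a specific source.
    start_urls = ["https://www.federalregister.gov/presidential-documents/executive-orders"]

    def parse(self, response):
        # The selector is an assumption about the listing page's markup.
        for link in response.css("a::attr(href)").getall():
            if "executive-order" in link:
                yield {"url": response.urljoin(link)}
</pre>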

===Kuwait Web Driver===
Term: Fall 2016

Usage: Used to download CSVs of bills and questions from the Kuwaiti government website. Uses Selenium. All scripts in the folder do similar things.

E:\McNair\Projects\Middle East Studies Web Drivers\Kuwait

===Moroccan Web Driver===
Term: Fall 2016

Usage: Used to download PDFs of bills and questions from the Moroccan government website. Uses Selenium. All scripts in the folder do similar things.

E:\McNair\Projects\Middle East Studies Web Drivers\Morocco\Moroccan Bills
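
These drivers follow a common pattern: use Selenium to load a listing page, collect the document links, and download the files. The sketch below shows that pattern only; the URL, the link-selection rule, and the lack of pagination handling are all simplifications of what the real scripts do.
<pre>
"""Hedged sketch of a Selenium-based PDF downloader (placeholder URL and selection rule)."""
import os

import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

def download_pdfs(page_url, out_dir="bill_pdfs"):
    """Open a page, collect links ending in .pdf, and download each file."""
    os.makedirs(out_dir, exist_ok=True)
    driver = webdriver.Chrome()
    try:
        driver.get(page_url)
        links = [a.get_attribute("href")
                 for a in driver.find_elements(By.TAG_NAME, "a")]
    finally:
        driver.quit()
    for href in links:
        if href and href.lower().endswith(".pdf"):
            name = href.rstrip("/").split("/")[-1]
            resp = requests.get(href, timeout=60)
            with open(os.path.join(out_dir, name), "wb") as f:
                f.write(resp.content)

if __name__ == "__main__":
    download_pdfs("https://www.example-parliament-site.ma/bills")  # placeholder page
</pre>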
  
 
==Time at McNair==
[[Peter Jalbert (Work Log)]]

[[Category:McNair Staff]]
