whitespace and broken links

Adam 2023-02-05 21:30:53 -05:00
parent 276f269c8e
commit af0683890c
3 changed files with 9 additions and 11 deletions

View file

@@ -22,7 +22,7 @@
   </p>
   <h3>Introducing Scrapey!</h3>
   <p>
-    <a href="projects/reddit/scrapey.html">Scrapey</a> is my scraper script that takes a snapshot
+    <a href="https://doordesk.net/projects/reddit/scrapey.html">Scrapey</a> is my scraper script that takes a snapshot
     of Reddit/r/all hot and saves the data to a .csv file including a calculated age for
     each post about every 12 minutes. Run time is about 2 minutes per iteration and each
     time adds about 100 unique posts to the list while updating any post it's already seen.
@@ -33,7 +33,7 @@
   </p>
   <h3>EDA</h3>
   <p>
-    <a href="projects/reddit/EDA.html">Next I take a quick look to see what looks useful</a>, what
+    <a href="https://doordesk.net/projects/reddit/EDA.html">Next I take a quick look to see what looks useful</a>, what
     doesn't, and check for outliers that will throw off the model. There were a few outliers
     to drop from the num_comments column.
   </p>
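The outlier drop described in this hunk's context could be sketched as below. This is a hypothetical reconstruction (the actual steps live in the linked EDA notebook); the 1.5×IQR fence is an assumption, not taken from the commit.

```python
import pandas as pd

def drop_outliers_iqr(df: pd.DataFrame, column: str, k: float = 1.5) -> pd.DataFrame:
    """Drop rows whose `column` value falls outside the k*IQR fences."""
    q1, q3 = df[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return df[df[column].between(lo, hi)]

# A few extreme num_comments values get removed, the rest survive.
posts = pd.DataFrame({"num_comments": [3, 5, 8, 4, 6, 5000]})
cleaned = drop_outliers_iqr(posts, "num_comments")
```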
@@ -54,7 +54,7 @@
     for further processing.
   </p>
   <h3>Clean</h3>
-  <p><a href="projects/reddit/clean.html">Cleaning the data further</a> consists of:</p>
+  <p><a href="https://doordesk.net/projects/reddit/clean.html">Cleaning the data further</a> consists of:</p>
   <ul>
     <li>Scaling numeric features between 0-1</li>
     <li>Converting '_' and '-' to whitespace</li>
@@ -84,7 +84,7 @@
   >
   so I mainly used the forest.
   </p>
-  <p><a href="projects/reddit/model.html">Notebook Here</a></p>
+  <p><a href="https://doordesk.net/projects/reddit/model.html">Notebook Here</a></p>
   <h3>Conclusion</h3>
   <p>Some Predictors from Top 25:</p>
   <ul>
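From the description in the first hunk (snapshot r/all hot about every 12 minutes, save to a .csv with a calculated age, add new posts while updating any already seen), the Scrapey loop could be sketched as follows. This is a hypothetical reconstruction, not the actual script linked above; the `reddit` client is assumed to come from something like `praw.Reddit(client_id=..., client_secret=..., user_agent=...)`.

```python
import time
import pandas as pd

def merge_snapshot(existing: pd.DataFrame, snapshot: pd.DataFrame) -> pd.DataFrame:
    """Add new posts and update any post already seen, keyed on post id."""
    merged = pd.concat([existing, snapshot], ignore_index=True)
    return merged.drop_duplicates(subset="id", keep="last")

def take_snapshot(reddit, now: float) -> pd.DataFrame:
    """Snapshot r/all hot, including a calculated age for each post."""
    rows = [
        {
            "id": post.id,
            "title": post.title,
            "score": post.score,
            "num_comments": post.num_comments,
            "age_seconds": now - post.created_utc,  # the "calculated age"
        }
        for post in reddit.subreddit("all").hot(limit=100)
    ]
    return pd.DataFrame(rows)

def run_forever(reddit, csv_path: str = "all_hot.csv") -> None:
    """Main loop sketch: scrape, merge, save, sleep ~12 minutes. Not called here."""
    data = pd.DataFrame()
    while True:
        data = merge_snapshot(data, take_snapshot(reddit, time.time()))
        data.to_csv(csv_path, index=False)
        time.sleep(12 * 60)
```

The `keep="last"` in `drop_duplicates` is what makes each iteration update posts it has already seen rather than duplicating them.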

View file

@@ -9,10 +9,8 @@
   <p>
     After finding a number of ways not to begin the project formerly known as my capstone,
     I've finally settled on a
-    <a
-      href="https://www.kaggle.com/datasets/bwandowando/ukraine-russian-crisis-twitter-dataset-1-2-m-rows"
-      >dataset</a
-    >. The project is about detecting bots, starting with twitter. I've
+    <a href="https://www.kaggle.com/datasets/bwandowando/ukraine-russian-crisis-twitter-dataset-1-2-m-rows">dataset</a>.
+    The project is about detecting bots, starting with twitter. I've
     <a href="https://doordesk.net/projects/bots/docs/debot.pdf">studied</a> a
     <a href="https://doordesk.net/projects/bots/docs/botwalk.pdf">few</a>
     <a href="https://doordesk.net/projects/bots/docs/smu.pdf">different</a>