{"id":34702,"date":"2025-07-15T23:27:05","date_gmt":"2025-07-15T21:27:05","guid":{"rendered":"https:\/\/www.graviton.at\/letterswaplibrary\/thoughts-on-this-data-cleaning-project\/"},"modified":"2025-07-15T23:27:05","modified_gmt":"2025-07-15T21:27:05","slug":"thoughts-on-this-data-cleaning-project","status":"publish","type":"post","link":"https:\/\/www.graviton.at\/letterswaplibrary\/thoughts-on-this-data-cleaning-project\/","title":{"rendered":"Thoughts On This Data Cleaning Project?"},"content":{"rendered":"<div class=\"md\">\n<p>Hi all, I&#8217;m working on a data cleaning project and I was wondering if I could get some feedback on this approach.<\/p>\n<p><strong>Step 1:<\/strong> The tool recommends a data type for each variable and flags which columns look useful. The user confirms which columns should be analyzed and each variable&#8217;s type (numeric, categorical, monetary, date, etc.).<\/p>\n<p><strong>Step 2:<\/strong> The chatbot recommends fixes for missing values, impossible values (think dates far in the future, or homes priced at $0 or $5), and formatting standardization (think mixed currencies, or variant names such as New York City vs. NYC). The user must confirm each change.<\/p>\n<p><strong>Step 3:<\/strong> The user can preview the relevant changes through a before-and-after comparison of summary statistics and graphed distributions. 
Every change is logged in a version history that can be rolled back.<\/p>\n<p>Thank you all for your help!<\/p>\n<\/div>\n<p>submitted by <a href=\"https:\/\/www.reddit.com\/user\/Academic_Meaning2439\"> \/u\/Academic_Meaning2439 <\/a> <br \/> <span><a href=\"https:\/\/www.reddit.com\/r\/datasets\/comments\/1m0tu0t\/thoughts_on_this_data_cleaning_project\/\">[link]<\/a><\/span> <span><a href=\"https:\/\/www.reddit.com\/r\/datasets\/comments\/1m0tu0t\/thoughts_on_this_data_cleaning_project\/\">[comments]<\/a><\/span><\/p>","protected":false},"excerpt":{"rendered":"<p>Hi all, I&#8217;m working on a data cleaning project and I was wondering if I could 
get&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[85],"tags":[],"class_list":["post-34702","post","type-post","status-publish","format-standard","hentry","category-datatards","wpcat-85-id"],"_links":{"self":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts\/34702","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/comments?post=34702"}],"version-history":[{"count":0,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts\/34702\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/media?parent=34702"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/categories?post=34702"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/tags?post=34702"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}