{"id":25160,"date":"2024-01-08T22:27:23","date_gmt":"2024-01-08T21:27:23","guid":{"rendered":"https:\/\/www.graviton.at\/letterswaplibrary\/data-cleaning-time-data-missing-values\/"},"modified":"2024-01-08T22:27:23","modified_gmt":"2024-01-08T21:27:23","slug":"data-cleaning-time-data-missing-values","status":"publish","type":"post","link":"https:\/\/www.graviton.at\/letterswaplibrary\/data-cleaning-time-data-missing-values\/","title":{"rendered":"[Data Cleaning] Time Data Missing Values"},"content":{"rendered":"<p><!-- SC_OFF --><\/p>\n<div class=\"md\">\n<p>Hello, I&#8217;d like your help with an issue in a data science project. When handling missing values, I replace missing continuous data with the mean, but for time data I don&#8217;t think that&#8217;s the right approach. I found two alternatives: forward fill (ffill()) or backward fill (bfill()), and linear interpolation. However, I&#8217;m still wondering which one to use, since this is the first time I&#8217;m dealing with null values in time data.<\/p>\n<\/div>\n<p><!-- SC_ON -->   submitted by   <a href=\"https:\/\/www.reddit.com\/user\/t_abdessamad\"> \/u\/t_abdessamad <\/a> <br \/> <span><a href=\"https:\/\/www.reddit.com\/r\/datasets\/comments\/191vh8n\/data_cleaning_time_data_missing_values\/\">[link]<\/a><\/span>   <span><a href=\"https:\/\/www.reddit.com\/r\/datasets\/comments\/191vh8n\/data_cleaning_time_data_missing_values\/\">[comments]<\/a><\/span><\/p><div 
class='wti-clear'><\/div>","protected":false},"excerpt":{"rendered":"<p>Hello, I&#8217;d like your help with an issue in a data science project&#8230; When handling&#8230;<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[85],"tags":[],"class_list":["post-25160","post","type-post","status-publish","format-standard","hentry","category-datatards","wpcat-85-id"],"_links":{"self":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts\/25160","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/comments?post=25160"}],"version-history":[{"count":0,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/posts\/25160\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/media?parent=25160"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/categories?post=25160"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.graviton.at\/letterswaplibrary\/wp-json\/wp\/v2\/tags?post=25160"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
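Editor's sketch, not part of the original feed entry: since the question compares forward/backward fill with linear interpolation, the pandas snippet below shows how the three named methods behave on the same gap. The series, timestamps, and values are invented for illustration; the calls (`ffill()`, `bfill()`, `interpolate()`) are standard pandas.

```python
import pandas as pd

# Hypothetical hourly series with two gaps; values are invented.
idx = pd.date_range("2024-01-08", periods=6, freq="h")
s = pd.Series([10.0, None, None, 16.0, None, 20.0], index=idx)

# Forward fill: propagate the last observed value into the gap.
ffilled = s.ffill()

# Backward fill: pull the next observed value back into the gap.
bfilled = s.bfill()

# Linear interpolation: draw a straight line between the surrounding
# observations; method="time" weights by the actual timestamps.
interp = s.interpolate(method="time")

print(ffilled.tolist())  # [10.0, 10.0, 10.0, 16.0, 16.0, 20.0]
print(bfilled.tolist())  # [10.0, 16.0, 16.0, 16.0, 20.0, 20.0]
print(interp.tolist())   # [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
```

As a rough rule of thumb: fills suit values that genuinely persist between observations (e.g. a sensor state or a last-known price), while interpolation suits quantities that change smoothly over time; which applies depends on the data-generating process, not on the code.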