T139963: Can we switch from rf model to gb to save memory?
Status: Closed, Resolved · Visibility: Public
Tags:
- Scoring-platform-team (Current) (Done)
- editquality-modeling (Backlog)
- articlequality-modeling (Backlog)
- Spike
- User-Ladsgroup (Incoming)
Subscribers: Aklapper, Halfak, Zppix
Assigned To: Ladsgroup
Authored By: Halfak, Jul 11 2016
Description
- Look at memory usage
- Look at score ranges (do we need to increment the version?); see the sketch below
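A minimal, illustrative sketch of both checks, using scikit-learn directly rather than the actual editquality/revscoring pipeline: the synthetic dataset, feature count, and hyperparameters are placeholder assumptions, and pickled size is used only as a rough proxy for a model's memory footprint.

```python
# Hypothetical comparison sketch, not the editquality/revscoring tooling.
# Uses a synthetic dataset and placeholder hyperparameters.
import pickle

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

models = {
    "rf": RandomForestClassifier(n_estimators=100, random_state=0),
    "gb": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X, y)
    # Pickled size as a rough proxy for the in-memory footprint.
    size_mb = len(pickle.dumps(model)) / 1e6
    # Score-range check: if the probability distribution shifts noticeably
    # between model types, downstream consumers may need a version bump.
    scores = model.predict_proba(X)[:, 1]
    print(f"{name}: ~{size_mb:.1f} MB pickled, "
          f"score range [{scores.min():.3f}, {scores.max():.3f}]")
```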
Event Timeline
Halfak created this task. (Jul 11 2016, 4:58 PM)
Restricted Application added subscribers: Zppix, Aklapper. (Jul 11 2016, 4:58 PM)
Halfak claimed this task. (Jul 11 2016, 5:06 PM)
Halfak triaged this task as Medium priority.
Danny_B renamed this task from "[Spike] Can we switch from rf model to gb to save memory?" to "Can we switch from rf model to gb to save memory?". (Jul 11 2016, 5:42 PM)
Danny_B added a project: Spike.
Ladsgroup moved this task from Backlog to Done on the Scoring-platform-team (Current) board. (Jul 14 2016, 7:34 AM)
Halfak reassigned this task from Halfak to Ladsgroup. (Jul 16 2016, 12:04 AM)
Comment: https://github.com/wiki-ai/editquality/pull/40
Ladsgroup closed this task as Resolved. (Jul 18 2016, 9:09 PM)
Phabricator_maintenance added a project: User-Ladsgroup. (Aug 12 2016, 8:08 PM)