2013 IEEE International Conference on Computer Vision

Robust Matrix Factorization with Unknown Noise

Deyu Meng, Xi'an Jiaotong University (dymeng@mail.)
Fernando De la Torre, Carnegie Mellon University (ftorre@cs.cmu.edu)

Abstract

Many problems in computer vision can be posed as recovering a low-dimensional subspace from high-dimensional visual data. Factorization approaches to low-rank subspace estimation minimize a loss function between an observed measurement matrix and a bilinear factorization. The most popular loss functions include the L2 and L1 losses. L2 is optimal for Gaussian noise, while L1 is for Laplacian distributed noise. However, real data is often corrupted by an unknown noise distribution, which is unlikely to be purely Gaussian or Laplacian. To address this problem, this paper proposes a low-rank matrix factorization problem with a Mixture of Gaussians (MoG) noise model. The MoG model is a universal approximator for any continuous distribution, and hence is able to model a wider range of noise distributions. The parameters of the MoG model can be estimated with a maximum likelihood method, while the subspace is computed with standard approaches. We illustrate the benefits of our approach in extensive synthetic and real-world experiments including structure from motion, face modeling and background subtraction.
Figure 1. (a) An input face image, displayed twice with different intensity ranges (the brightened version is the lower one). (b) The reconstructed image ($UV^T$), the error image ($E = X - UV^T$) and the histogram of the error computed with the L2 loss. (c) Same as (b) but with the L1 loss. (d) The reconstructed image and the two Gaussian errors, with smaller and larger variances, obtained by our method. (Figure best viewed in color; zoom in on a computer screen to see details.)

1. Introduction

Many computer vision, machine learning and statistical problems can be formulated as learning a low-dimensional linear model. These linear models have been widely used in computer vision to solve problems such as structure from motion [39], face recognition [43], photometric stereo [19], object recognition [40], motion segmentation [41] and plane-based pose estimation [36].

Let $X = [x_1, \dots, x_n] \in \mathbb{R}^{d \times n}$ (see footnote 1 for notation) be a matrix where each column $x_i$ is a $d$-dimensional measurement. Standard approaches to subspace learning optimize the Low Rank Matrix Factorization (LRMF) error:

  $\min_{U,V} \|W \odot (X - UV^T)\|_{L_p}$,   (1)

where $U \in \mathbb{R}^{d \times r}$ and $V \in \mathbb{R}^{n \times r}$ are low-dimensional matrices ($r \ll \min(d, n)$), and $W$ is the indicator matrix of the same size as $X$, with $w_{ij} = 0$ if $x_{ij}$ is missing and 1 otherwise. $\|\cdot\|_{L_p}$ denotes the $p$-th power of an $L_p$ norm, and the most popular approaches use the L2 and L1 norms. A main advantage of minimizing the L2 norm is that the optimization problem is smooth and there are multiple fast numerical solvers [9, 28, 31, 32, 35, 44]. A closed-form solution for $U$ and $V$ can be computed with the Singular Value Decomposition (SVD) when all data is available (no missing data). However, the L2 norm is only optimal for Gaussian noise and provides biased estimates in the presence of outliers and non-Gaussian noise distributions. In order to introduce robustness to outliers, the L2 norm can be replaced by robust functions [11] or the L1 norm [13, 14, 18, 20, 22, 39, 43]. Unfortunately, these approaches do not have closed-form solutions and lead to non-smooth optimization problems.

¹ Bold uppercase letters denote matrices, bold lowercase letters denote vectors, and non-bold letters represent scalar variables. $d_i$ and $d^i$ represent the $i$-th column and row vectors of the matrix $D$, respectively, and $d_{ij}$ denotes the scalar in the $i$-th row and $j$-th column of $D$. $\odot$ denotes the Hadamard product (component-wise multiplication). $\|D\|_{L_p} = \sum_{i,j} |d_{ij}|^p$ denotes the power-$p$ norm of a matrix.
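To make Eq. (1) concrete, the following NumPy sketch evaluates the masked $L_p$ objective for given factors; the function and variable names are ours for illustration, not taken from the authors' released code.

```python
import numpy as np

def lrmf_loss(X, U, V, W, p=2):
    """p-th power of the entrywise Lp norm of W ⊙ (X - U V^T), as in Eq. (1).

    W is the binary indicator of observed entries, so missing entries
    contribute nothing to the loss; p=2 gives the L2 loss, p=1 the L1 loss.
    """
    R = W * (X - U @ V.T)          # Hadamard product masks the missing entries
    return np.sum(np.abs(R) ** p)

# Tiny usage example on random data with roughly 20% missing entries.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 20))
U, V = rng.standard_normal((40, 4)), rng.standard_normal((20, 4))
W = (rng.random((40, 20)) > 0.2).astype(float)
print(lrmf_loss(X, U, V, W, p=2), lrmf_loss(X, U, V, W, p=1))
```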
While the L2 or L1 norms are only optimal if the noise follows a Gaussian or Laplacian distribution, this is not the case in most real problems. For instance, consider the case of face images of one subject taken under varying illumination conditions (e.g., Fig. 1(a) illustrates one image of the Yale B database [16]). Under the assumption that the face is a Lambertian surface, the faces under different point light sources can be recovered by a three-dimensional subspace (e.g., [34]). If the diffusion or background illumination is considered in the model, the subspace will be of dimension four [21]. However, in real images there are different types of noise sources. First, the face is not a perfect Lambertian surface, and there are cast shadows. Second, due to the camera range settings there might be pixels that are saturated, and there exist specular reflections (especially in people with glasses). Third, the camera noise ("read noise") is amplified in the dark areas [30] (see Fig. 1(a)). These different types of noise can have different distributions, and minimizing either the L2 or L1 loss is unlikely to produce a good model to factorize the illumination component (see Fig. 1(b) and Fig. 1(c), respectively).

To address this issue, this paper proposes a simple but effective approach to LRMF with unknown noise. The key idea is to model the noise as a Mixture of Gaussians (MoG) [26], which is a universal approximator to any continuous density function [25]. Thus, it subsumes the prior popular L2 and L1 models (the Laplace distribution can be equivalently expressed as a scaled MoG [2]). Fig. 1(d) illustrates how the proposed MoG noise model can better account for the different types of noise and provide a better estimate of the underlying face. The parameters of our proposed model, subspace-MoG, can be estimated with traditional Expectation-Maximization (EM) under a Maximum Likelihood Estimation (MLE) framework. The effectiveness of our MoG method is shown in synthetic, Structure From Motion (SFM), face modeling and background subtraction experiments.

2. Previous work

The L2-norm LRMF with missing data has been studied in the statistical literature since the early 80's. Gabriel and Zamir [15] proposed a weighted SVD technique that used alternated minimization (or criss-cross regression) to find the principal subspace of the data. De la Torre and Black [11] proposed Robust PCA by changing the L2 norm to a robust function to deal with outliers. They used Iteratively-Reweighted Least-Squares (IRLS) to solve the problem. This approach can handle missing data by setting weights to zero in the IRLS algorithm, but it is prone to local minima. Srebro and Jaakkola [35] proposed the Weighted Low-rank Approximation (WLRA) algorithm, which uses EM or conjugate gradient descent depending on the complexity of the structure of the problem. To avoid local minima, Buchanan and Fitzgibbon [6] added regularization terms to Eq. (1) and modified the Levenberg-Marquardt (LM) algorithm to estimate the variables (U, V) jointly. Chen [9] later proposed modifications of LM algorithms to improve their efficiency by solving a smaller linear system in every iteration. Okatani and Deguchi [31] showed that a Wiberg marginalization strategy on U or V provides a robust initialization, but its high memory requirements make it impractical for medium-size datasets. Aguiar et al. [1] introduced a globally optimal solution to L2 LRMF with missing data under the assumption that the missing data has a special Young diagram structure. More recently, Zhao and Zhang [44] introduced the SALS method, which constrains the components of X to lie within a range and considers the L2 LRMF as a constrained model. Mitra et al. [28] showed that the matrix factorization problem can be formulated as a low-rank semidefinite program and proposed an augmented Lagrangian method. However, all of these works minimize an L2 error that is only optimal for Gaussian noise.

In order to introduce robustness to outliers, Ke and Kanade [20] suggested replacing the L2 loss with the L1 norm, minimized by alternated linear or quadratic programming (ALP/AQP). A more efficient method called PCAL1 was then proposed by Kwak [22]. This method maximizes the L1 norm of the projected data. Similarly to the L2 Wiberg approach [31], Eriksson and van den Hengel [14] experimentally showed that the alternated convex programming approach frequently does not converge to the desired point, and introduced the L1 Wiberg approach to address this. Very recently, Zheng et al. [45] extended [14] by adding a nuclear norm regularizer on V and orthogonality constraints on U, which resulted in improvements on the structure from motion problem. In the compressed sensing literature, Wright et al. [43] proposed a Robust PCA method using recent advances in rank minimization. A major advantage of this approach lies in its convex formulation even in the case of sparse outliers and missing data. These methods, however, optimize an L1 norm error and are thus only optimal for Laplacian noise.

Beyond these deterministic LRMF methods, there have been several probabilistic extensions of matrix factorization. Factor analysis (FA) [4] is a probabilistic extension of PCA that assumes normally distributed coefficients (U) and a diagonal Gaussian noise model. An instance of FA is the probabilistic Principal Component Analysis (PPCA) [29, 33, 38] model. Unlike FA, PPCA assumes an isotropic Gaussian noise model. Other probabilistic extensions include the mixture of PPCA [37], which extends PPCA by considering a mixture model in which the components are probabilistic PCA models (Mixture PCA). Recently, some probabilistic frameworks for robust matrix factorization [42, 23, 3] have been proposed that model the noise with Laplacian or student-t distributions. Unlike previous work, we model our noise as a MoG and not a particular unimodal distribution.

3. LRMF with MoG noise

This section proposes a new LRMF method with a MoG noise model: a matrix factorization method that accounts for multi-modal noise distributions.

3.1. The subspace-MoG model

In LRMF, each element $x_{ij}$ ($i = 1, \dots, d$; $j = 1, \dots, n$) of the input matrix $X$ can be modeled as

  $x_{ij} = (u^i)^T v^j + \varepsilon_{ij}$,   (2)

where $u^i$ and $v^j$ are the $i$-th and $j$-th row vectors of $U$ and $V$, respectively, and $\varepsilon_{ij}$ denotes the noise in $x_{ij}$. It can easily be shown that the L2 or L1 LRMF model (1) corresponds to the MLE of this problem when $\varepsilon_{ij}$ is independently sampled from a Gaussian or Laplace distribution, respectively. To deal with more complex problems in computer vision, it is natural to use a MoG to model the noise, since it is a universal approximator to any continuous distribution [25]. For instance, a Laplacian distribution can be equivalently expressed as a scaled MoG [2]. Therefore, in this paper we will assume that each $\varepsilon_{ij}$ in Eq. (2) is a sample from a MoG distribution $p(\varepsilon)$, defined as

  $p(\varepsilon) \sim \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\varepsilon \mid 0, \sigma_k^2)$,

where $\mathcal{N}(\varepsilon \mid 0, \sigma^2)$ denotes the Gaussian distribution with mean 0 and variance $\sigma^2$, and $\pi_k \geq 0$ is the mixing proportion with $\sum_{k=1}^{K} \pi_k = 1$. Then, the probability of each element $x_{ij}$ of $X$ can be written as

  $p(x_{ij} \mid u^i, v^j, \Pi, \Sigma) = \sum_{k=1}^{K} \pi_k \, p(x_{ij} \mid k)$,

where $p(x_{ij} \mid k) = \mathcal{N}(x_{ij} \mid (u^i)^T v^j, \sigma_k^2)$, $\Pi = \{\pi_1, \dots, \pi_K\}$, and $\Sigma = \{\sigma_1, \dots, \sigma_K\}$. The likelihood of $X$ can then be written as

  $p(X \mid U, V, \Pi, \Sigma) = \prod_{i,j \in \Omega} p(x_{ij} \mid (u^i)^T v^j, \Pi, \Sigma) = \prod_{i,j \in \Omega} \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_{ij} \mid (u^i)^T v^j, \sigma_k^2)$,

where $\Omega$ is the index set of the non-missing entries in $X$. Given the likelihood, our aim is to maximize the log-likelihood function w.r.t. the MoG parameters $\Pi, \Sigma$ and the LRMF parameters $U, V$, that is:

  $\max_{U,V,\Pi,\Sigma} L(U, V, \Pi, \Sigma) = \sum_{i,j \in \Omega} \log \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_{ij} \mid (u^i)^T v^j, \sigma_k^2)$.   (3)

In the following we will refer to problem (3) as the subspace-MoG model.
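A small sketch of the generative model in Eqs. (2)-(3): it draws a low-rank matrix, corrupts it with MoG noise, and evaluates the subspace-MoG log-likelihood. The mixture settings (K = 2 and the particular values of pi_k and sigma_k) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Rank-r factors and the noise-free matrix of Eq. (2); sizes are illustrative.
d, n, r = 40, 20, 4
U = rng.standard_normal((d, r))
V = rng.standard_normal((n, r))
X_clean = U @ V.T

# MoG noise: mixing proportions pi_k and standard deviations sigma_k
# (assumed values for this demo).
pi = np.array([0.8, 0.2])
sigma = np.array([0.1, 2.0])
comp = rng.choice(len(pi), size=(d, n), p=pi)   # latent component of each eps_ij
X = X_clean + rng.standard_normal((d, n)) * sigma[comp]

def subspace_mog_loglik(X, U, V, pi, sigma, W=None):
    """Log-likelihood of Eq. (3); W is a 0/1 mask of observed entries
    (all entries observed if None)."""
    E = X - U @ V.T
    # p(x_ij) = sum_k pi_k N(x_ij | (u^i)^T v^j, sigma_k^2), entrywise
    dens = sum(p * norm.pdf(E, scale=s) for p, s in zip(pi, sigma))
    ll = np.log(dens)
    return np.sum(ll if W is None else W * ll)

print(subspace_mog_loglik(X, U, V, pi, sigma))
```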
3.2. EM algorithm

The EM algorithm [12] can be used to estimate the parameters $(U, V, \Pi, \Sigma)$ that maximize the likelihood function of the subspace-MoG model. Recall that in the standard EM algorithm for a MoG there is a mean for each cluster, whereas in our case all clusters share the variables $U, V$. Our proposed algorithm iterates between calculating the responsibilities of all Gaussian components (E Step) and maximizing the parameters $\Pi, \Sigma$ and $U, V$ of the model (M Step).

E Step: Assume a latent variable $z_{ijk}$ in the model, with $z_{ijk} \in \{0, 1\}$ and $\sum_{k=1}^{K} z_{ijk} = 1$, indicating the assignment of the noise $\varepsilon_{ij}$ to a specific component of the mixture. The posterior responsibility of mixture $k$ ($= 1, \dots, K$) for generating the noise of $x_{ij}$ ($i = 1, \dots, d$; $j = 1, \dots, n$) is then calculated by [12]:

  $E(z_{ijk}) = \gamma_{ijk} = \dfrac{\pi_k \, \mathcal{N}(x_{ij} \mid (u^i)^T v^j, \sigma_k^2)}{\sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_{ij} \mid (u^i)^T v^j, \sigma_k^2)}$.   (4)

M Step: The M step maximizes the lower bound given by the E step w.r.t. $U, V, \Pi, \Sigma$ [12]:

  $E_Z[\log p(X, Z \mid U, V, \Pi, \Sigma)] = \sum_{i,j \in \Omega} \sum_{k=1}^{K} \gamma_{ijk} \left( \log \pi_k - \log \sqrt{2\pi}\sigma_k - \dfrac{(x_{ij} - (u^i)^T v^j)^2}{2\sigma_k^2} \right)$.   (5)

An easy way to solve this maximization problem is to alternately update the MoG parameters $\Pi, \Sigma$ and the factorized matrices $U, V$ as follows:

Update $\Pi, \Sigma$: Closed-form updates for the MoG parameters (for $k = 1, \dots, K$) are [12]:

  $N_k = \sum_{i,j} \gamma_{ijk}, \quad \pi_k = \dfrac{N_k}{N}, \quad \sigma_k^2 = \dfrac{1}{N_k} \sum_{i,j} \gamma_{ijk} (x_{ij} - (u^i)^T v^j)^2$,   (6)

where $N$ is the number of non-missing entries.

Update $U, V$: The components of Eq. (5) related to $U$ and $V$ can be rewritten as follows:

  $-\sum_{i,j \in \Omega} \sum_{k=1}^{K} \dfrac{\gamma_{ijk} (x_{ij} - (u^i)^T v^j)^2}{2\sigma_k^2} = -\left\| W \odot (X - UV^T) \right\|_{L_2}$,   (7)

where the element $w_{ij}$ of $W \in \mathbb{R}^{d \times n}$ is

  $w_{ij} = \sqrt{\sum_{k=1}^{K} \dfrac{\gamma_{ijk}}{2\sigma_k^2}}$ for $(i,j) \in \Omega$, and $w_{ij} = 0$ for $(i,j) \notin \Omega$.   (8)

It is interesting to observe that the maximization of (7) is exactly equivalent to a weighted L2 LRMF problem. We can use any off-the-shelf algorithm, such as Alternated Least Squares (ALS) [11], WLRA [35] or DN [6], to update $U, V$ in our method. We adopted ALS due to its simplicity of implementation and good performance. The optimization process is summarized in Algorithm 1.

Algorithm 1: MoG algorithm for LRMF
Input: $X = (x_1, \dots, x_n) \in \mathbb{R}^{d \times n}$, index set $\Omega$ of non-missing entries of $X$, MoG number $K$, small threshold $\varepsilon$.
Output: $U, V$.
Randomly initialize $\Pi, \Sigma, U, V$.
repeat
  (E Step): Evaluate $\gamma_{ijk}$ for $i = 1, \dots, d$; $j = 1, \dots, n$; $k = 1, \dots, K$ by Eq. (4).
  (M Step for $\Pi, \Sigma$): Evaluate $\pi_k, \sigma_k^2$ for $k = 1, \dots, K$ by Eq. (6).
  (M Step for $U, V$): Evaluate $U, V$ by solving $\min_{U,V} \|W \odot (X - UV^T)\|_{L_2}$ through ALS, where $W$ is calculated by Eq. (8).
  (Automatic K tuning): If $|\sigma_i^2 - \sigma_j^2| / (\sigma_i^2 + \sigma_j^2) < \varepsilon$ for some $i, j$, combine the $i$-th and $j$-th Gaussian components into a single Gaussian by letting $\pi_i = \pi_i + \pi_j$ and $\sigma_i^2 = (n_i \sigma_i^2 + n_j \sigma_j^2) / (n_i + n_j)$, where $n_i$ is the number of elements in the $i$-th Gaussian component, and removing the $j$-th Gaussian parameters from $\Pi, \Sigma$. Let $K = K - 1$.
until convergence

3.3. Other details of the subspace-MoG model

Number of Gaussian components: We propose a simple but effective method to automatically estimate the number of Gaussians in our model. We start with a given number of Gaussian mixtures (e.g., K = 6) that is large enough to fit the noise distribution in all our experiments. After each iteration (E and M step), we check if the relative deviation $|\sigma_i^2 - \sigma_j^2| / (\sigma_i^2 + \sigma_j^2)$ between the variances of two Gaussian components is smaller than some small threshold $\varepsilon$ ($\varepsilon = 0.1$ for all experiments). If so, the two mixtures are naturally seen as a similar Gaussian and can be combined. The number K is thus reduced to K - 1.

Local minima: Our algorithm is iterative in nature, and in each iteration we are guaranteed to not decrease the energy of the log-likelihood function (Eq. (3)). However, the log-likelihood is subject to local maxima [12]. A commonly used strategy to alleviate this problem is to apply multiple random initializations and select the one with the largest log-likelihood.

Termination conditions: We stop the algorithm when the change in U between consecutive iterations is smaller than a pre-specified small threshold, or when the maximum number of iterations is reached.

Robustness to outliers: L2 LRMF is generally considered to be sensitive to outliers. In our subspace-MoG model, however, an outlier will belong to a mixture with large variance, and $w_{ij}$ will have a small value based on Eq. (8). This reduces the influence of the outlier in the solution.
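The following self-contained Python sketch mirrors our reading of Algorithm 1: responsibilities by Eq. (4), closed-form Pi/Sigma updates by Eq. (6), a weighted-ALS inner solver for the weighted L2 problem of Eqs. (7)-(8), and the automatic merging of near-duplicate components. The tiny ridge term and the numerical-safety constants are our additions, not part of the paper.

```python
import numpy as np

def weighted_als(X, W, U, V, n_inner=5, reg=1e-8):
    """A few alternated weighted least-squares passes for the M step for U, V:
    min_{U,V} ||W ⊙ (X - U V^T)||_{L2}. The small ridge `reg` is our addition."""
    W2 = W ** 2
    d, n = X.shape
    I = reg * np.eye(U.shape[1])
    for _ in range(n_inner):
        for i in range(d):                  # row-wise update of U
            G = V.T * W2[i]                 # r x n, column j scaled by w_ij^2
            U[i] = np.linalg.solve(G @ V + I, G @ X[i])
        for j in range(n):                  # row-wise update of V
            G = U.T * W2[:, j]
            V[j] = np.linalg.solve(G @ U + I, G @ X[:, j])
    return U, V

def gauss(E, s2):
    """Entrywise N(e | 0, s2) density."""
    return np.exp(-E ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

def mog_lrmf(X, Omega, r, K=6, eps=0.1, tol=0.01, max_iter=100, seed=0):
    """Sketch of Algorithm 1; Omega is a boolean mask of observed entries."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    U = rng.standard_normal((d, r))
    V = rng.standard_normal((n, r))
    pi = np.full(K, 1.0 / K)
    s2 = rng.uniform(0.1, 1.0, K)                    # the sigma_k^2
    N = Omega.sum()
    for _ in range(max_iter):
        U_old = U.copy()
        E = X - U @ V.T
        # E step (Eq. (4)): responsibilities, restricted to observed entries.
        R = np.stack([p * gauss(E, v) for p, v in zip(pi, s2)])   # K x d x n
        R /= R.sum(axis=0, keepdims=True) + 1e-300
        R = R * Omega
        # M step for Pi, Sigma (Eq. (6)).
        Nk = R.sum(axis=(1, 2))
        pi = Nk / N
        s2 = np.maximum((R * E ** 2).sum(axis=(1, 2)) / (Nk + 1e-300), 1e-10)
        # M step for U, V: weighted L2 LRMF with the weights of Eq. (8).
        W = np.sqrt((R / (2 * s2[:, None, None])).sum(axis=0))
        U, V = weighted_als(X, W, U, V)
        # Automatic K tuning: merge components whose variances nearly coincide.
        k = 0
        while k < len(s2) - 1:
            j = k + 1
            while j < len(s2):
                if abs(s2[k] - s2[j]) / (s2[k] + s2[j]) < eps:
                    s2[k] = (Nk[k]*s2[k] + Nk[j]*s2[j]) / (Nk[k] + Nk[j] + 1e-300)
                    pi[k] += pi[j]
                    Nk[k] += Nk[j]
                    pi, s2, Nk = (np.delete(a, j) for a in (pi, s2, Nk))
                else:
                    j += 1
            k += 1
        if np.max(np.abs(U - U_old)) < tol:    # termination on the change in U
            break
    return U, V, pi, s2
```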
4. Experiments

To evaluate the performance of the proposed subspace-MoG method, we conducted extensive synthetic, Structure From Motion (SFM), face modeling and background subtraction experiments. In the synthetic and SFM experiments we analyzed the performance of our algorithm in situations where the ground truth is known and we added different types of noise to it. The experiments in face modeling and background subtraction illustrate how the MoG noise model is a realistic assumption for visual data. All methods were implemented in Matlab R2011b and run on a PC with an Intel CPU and 4GB of RAM.

To properly measure the capability of the various non-convex LRMF optimization models, all competing methods (except SVD and nuclear-norm based Robust PCA [43]) were run with 10 random initializations, and the best result was selected. All methods run a maximum of 100 iterations or stop when the difference between consecutive U's is smaller than 0.01.

In all experiments, we compared our approach with several LRMF methods, including Robust PCA (IRLS) [11]², nuclear-norm based Robust PCA [43] (NN-RobustPCA)³, two representative methods for L2 LRMF, WLRA [35] and DN [6]⁴, and four state-of-the-art methods for L1 LRMF: ALP [20]⁵, L1Wiberg [14]⁶, RegL1ALM [45]⁷ and CWM [27]. Because code was not available, we implemented WLRA [35]. The NN-RobustPCA method provides the rank of the matrix as a function of the regularization parameters. In some experiments the rank is known a priori, and we have selected the regularization parameter that satisfies the required rank.

² http://www.cs.cmu.edu/~ftorre/
³ http://perception.csl.illinois.edu/matrix-rank/sample_code.html
⁴ http://www.robots.ox.ac.uk/~abm/
⁵ We used the code "l1decode_pd.m" [8] for solving the linear programming problem, downloaded from http://www-inst.eecs.berkeley.edu/~ee225B/sp08/lectures/CSmeetsML-Lecture1/codes/l1magic/Optimization.
⁶ http://cs.adelaide.edu.au/~anders/code/cvpr2010.html
⁷ …/site/yinqiangzheng/

4.1. Synthetic experiments

Four sets of synthetic experiments were designed to evaluate the performance of our method against the other LRMF methods with different types of noise. For each set of experiments, we randomly generated 30 low-rank matrices, each of size 40×20 and rank 4. Each of these matrices is generated by the multiplication of two low-rank matrices $U_{gt} \in \mathbb{R}^{40 \times 4}$ and $V_{gt} \in \mathbb{R}^{20 \times 4}$, and $X_{gt} = U_{gt} V_{gt}^T$ is the ground truth matrix. Each element of $U_{gt}$ and $V_{gt}$ is generated from a Gaussian distribution N(0, 1). In each experiment, we randomly specified 20% of missing entries in $X_{gt}$ and further added different types of noise as follows: (1) No noise added. (2) Gaussian noise N(0, 0.1). (3) Sparse noise: 20% of the entries were corrupted with uniformly distributed noise between [-5, 5]. (4) Mixture noise: 20% of the entries were corrupted with uniformly distributed noise over [-5, 5], 20% were contaminated with Gaussian noise N(0, 0.2), and the remaining 40% were corrupted with Gaussian noise N(0, 0.01). The noisy matrix is denoted as $X_{no}$.
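Below is a sketch of one realization of the mixture-noise setting (case (4)). We read the percentages as disjoint subsets of the 40×20 entries (20% missing, 20% uniform, 20% Gaussian with variance 0.2, 40% Gaussian with variance 0.01), and we take the second parameter of N(0, ·) to be a variance; both readings are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, r = 40, 20, 4

# Rank-4 ground truth, as in Section 4.1: X_gt = U_gt V_gt^T.
U_gt = rng.standard_normal((d, r))
V_gt = rng.standard_normal((n, r))
X_gt = U_gt @ V_gt.T

# Partition the d*n entries into four disjoint groups (our reading).
idx = rng.permutation(d * n)
n20 = int(0.2 * d * n)
miss, unif, g1, g2 = idx[:n20], idx[n20:2*n20], idx[2*n20:3*n20], idx[3*n20:]

Omega = np.ones(d * n, dtype=bool)
Omega[miss] = False                                  # 20% missing entries

noise = np.zeros(d * n)
noise[unif] = rng.uniform(-5, 5, unif.size)          # 20% uniform on [-5, 5]
noise[g1] = rng.normal(0, np.sqrt(0.2), g1.size)     # 20% N(0, 0.2)
noise[g2] = rng.normal(0, np.sqrt(0.01), g2.size)    # 40% N(0, 0.01)

X_no = (X_gt.ravel() + noise).reshape(d, n)
Omega = Omega.reshape(d, n)
```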
The final performance of each method on each experiment was measured as the average over the 30 realizations, with six error measures:

  $E_1 = \|W \odot (X_{no} - \hat{U}\hat{V}^T)\|_{L_1}, \quad E_2 = \|W \odot (X_{no} - \hat{U}\hat{V}^T)\|_{L_2}$,
  $E_3 = \|X_{gt} - \hat{U}\hat{V}^T\|_{L_1}, \quad E_4 = \|X_{gt} - \hat{U}\hat{V}^T\|_{L_2}$,
  $E_5 = \text{subspace}(U_{gt}, \hat{U}), \quad E_6 = \text{subspace}(V_{gt}, \hat{V})$,

where $\hat{U}, \hat{V}$ are the outputs of the corresponding LRMF method, and subspace($U_1$, $U_2$) denotes the angle between the subspaces spanned by the columns of $U_1$ and $U_2$. It is important to notice that existing methods optimize E1 or E2, but the last four measures (E3-E6) are more meaningful for evaluating whether a method recovers the correct subspace.

The performance of the methods is shown in Table 1.

Table 1. Six measures of error for the synthetic example with different noise models. The best and the second best results in each experiment are highlighted in bold and italic, respectively.

It can be observed from Table 1 that although the L1 and L2 LRMF methods generally perform better in terms of E1 and E2, respectively, the proposed subspace-MoG method performs best or second best in all experiments in estimating a better subspace from noisy data (measures E3-E6). In particular, in the fourth set of experiments (when the noise is a mixture) our method always performs best in all errors.

4.2. SFM experiments

The SFM problem can be formulated as an LRMF task [20, 14]. We used the well-known dinosaur sequence [20], which contains projections of 319 points tracked over 36 frames, leading to a 319×72 matrix. The matrix contains around 77% missing data due to occlusions or tracking failures. We added four types of noise to the matrix: (1) No noise added. (2) Gaussian noise N(0, 10). (3) Sparse noise: 10% of the non-missing elements were corrupted by uniformly distributed noise ([-50, 50]). (4) Mixture noise: 10% of the non-missing elements were corrupted by uniformly distributed noise ([-50, 50]); the remaining 90% were contaminated with Gaussian noise N(0, 10). Four quantitative criteria were utilized for performance evaluation in these experiments:

  $E_1 = \|W \odot (X_{no} - \hat{U}\hat{V}^T)\|_{L_1}, \quad E_2 = \|W \odot (X_{no} - \hat{U}\hat{V}^T)\|_{L_2}$,
  $E_3 = \|W \odot (X_{gr} - \hat{U}\hat{V}^T)\|_{L_1}, \quad E_4 = \|W \odot (X_{gr} - \hat{U}\hat{V}^T)\|_{L_2}$,

where $\hat{U}, \hat{V}$ are the outputs of the corresponding LRMF method. We compared with the same methods as in the previous experiments, with the exception of L1Wiberg, because it did not fit into memory. The results are shown in Table 2.

Table 2. Performance comparison of the competing methods in the SFM experiments. The best and the second best results in each experiment are highlighted in bold and italic, respectively.

Similar to our synthetic experiments, the proposed subspace-MoG method does not perform best among all competing methods in terms of E1 and E2. However, it always has the best or the second best performance in the errors E3 and E4; in particular, it performs best in terms of both E3 and E4 in the mixture noise experiments.
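For reference, a sketch of the error measures used in Sections 4.1-4.2; `subspace_angle` (the largest principal angle between column spans) is our stand-in for Matlab's subspace(), not the authors' exact implementation.

```python
import numpy as np

def l1(A): return np.abs(A).sum()          # ||A||_{L1} = sum |a_ij|
def l2(A): return (A ** 2).sum()           # ||A||_{L2} = sum a_ij^2 (p-th power)

def subspace_angle(A, B):
    """Largest principal angle between the column spans of A and B,
    playing the role of Matlab's subspace() in E5/E6."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

def errors(X_no, X_gt, W, U_gt, V_gt, U_hat, V_hat):
    """E1-E6 of Section 4.1; the SFM variant of Section 4.2 also applies
    the mask W to the ground-truth residual."""
    R_no = W * (X_no - U_hat @ V_hat.T)
    R_gt = X_gt - U_hat @ V_hat.T
    return dict(E1=l1(R_no), E2=l2(R_no), E3=l1(R_gt), E4=l2(R_gt),
                E5=subspace_angle(U_gt, U_hat), E6=subspace_angle(V_gt, V_hat))
```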
4.3. Face modeling experiments

This experiment tests the effectiveness of the proposed MoG method for building a model of the face under different illuminations. The data matrix is generated by extracting the first subset of the Extended Yale B database [16, 5], containing 64 faces of one subject of size 192×168 under different illuminations. The input matrix thus has a size of 32256×64. Typical images used are plotted in the first column of Fig. 2. We compared with Robust PCA (IRLS), NN-RobustPCA [43], SVD [17], RegL1ALM [45], CWM [27] and PCAL1 [22]. The SVD method is implemented with the corresponding Matlab function. Assuming perfect conditions, these face images would lie on a 4-D subspace [5]; we thus set the rank to 4 for all competing methods except nuclear-norm based Robust PCA, which automatically selects the rank. Fig. 2 compares some reconstructed images obtained by these competing methods.

From Fig. 2, it is easy to observe that the proposed method, as well as the other competing methods, is capable of removing the cast shadows and saturations in the faces, as shown in the first row of Fig. 2. However, our method performs better over the faces with large dark regions, as shown in the 2nd-4th rows of Fig. 2. This can be easily explained by Fig. 1(d). Unlike existing methods, our approach is able to model a mixture of Gaussians noise. One mixture, with large variance, models shadows, saturated regions and outliers in the face (see the blue and yellow areas in the second noise image of Fig. 1(d)). The other mixture, with smaller variance, accounts for the camera noise, which is especially amplified in the dark areas of the face (see the first noise image of Fig. 1(d)).

Figure 2. From left to right: original face images, and faces reconstructed by MoG, Robust PCA (IRLS), SVD, RegL1ALM, PCAL1, CWM and nuclear-norm based Robust PCA, respectively.

4.4. Background subtraction experiments

This experiment compares our approach on the problem of background subtraction [11]. We built a background model by performing LRMF on seven video sequences provided by [24]⁸ (600 frames of 128×160 pixels) and one from [11]⁹ (506 frames of 120×160 pixels). The sequences include variations due to lighting changes, people walking, shadows, etc. See Fig. 3 and Fig. 4.

⁸ http://perception.i2r.a-star.edu.sg/bk_model/bk_index.html
⁹ http://www.cs.cmu.edu/~ftorre/codedata.html

We applied Robust PCA (IRLS) [11]; NN-RobustPCA [43]; the state-of-the-art method for L2 LRMF, SVD [17]; three state-of-the-art methods for L1 LRMF: RegL1ALM [45], CWM [27] and PCAL1 [22]; and our subspace-MoG (MoG). The dimension of the subspace was set to 6 for the videos from [24] and 15 for the video from [11].

Fig. 3 illustrates the results of running the different LRMF methods on the videos provided by [24]. We observe that all methods can provide a good background model. However, the proposed subspace-MoG method provides a more accurate model that decomposes the foreground information into three components with different variances, from small to large: (1) background variation; (2) shadows alongside the moving objects; (3) moving objects in the foreground (see the 2nd-4th rows of Fig. 3). The foreground extracted by the other competing methods is coarser because it merges the object and its shadow (see Frame 402 for easy visualization). Another example of the performance of subspace-MoG can be seen in Fig. 4 (video from [11]). Our method better reconstructs the illumination variations and is not biased by the random people walking, shadows, specular reflections, or motion of the tree.
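A minimal sketch of the background-subtraction pipeline of this section: frames are vectorized into the columns of X, a rank-r model is fitted, and the foreground is read off the residual. The SVD default stands in for the L2 baseline; `fit` is a hypothetical hook where an LRMF solver such as the subspace-MoG sketch above could be plugged in.

```python
import numpy as np

def background_subtraction(frames, rank=6, fit=None):
    """Background modeling as in Section 4.4: stack each h x w frame as a
    column of X, fit a rank-`rank` factorization, and take the background
    as U V^T and the foreground as the absolute residual."""
    h, w = frames[0].shape
    X = np.stack([f.ravel() for f in frames], axis=1)   # (h*w) x n_frames
    if fit is None:
        # Plain SVD, the L2 baseline.
        Uf, s, Vt = np.linalg.svd(X, full_matrices=False)
        U = Uf[:, :rank] * s[:rank]
        V = Vt[:rank].T
    else:
        U, V = fit(X, rank)                             # e.g., a MoG-LRMF solver
    background = (U @ V.T).T.reshape(-1, h, w)
    foreground = np.abs(X - U @ V.T).T.reshape(-1, h, w)
    return background, foreground
```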
Figure 3. From top row to bottom: original Lobby frames (Frames 386 and 402) and the absolute errors computed by the different methods (the details are better seen by zooming on a computer screen). The moving-object region in Frame 402 is enlarged for better visualization.

Figure 4. From left to right: original frames and the reconstructed backgrounds computed by the different methods.

5. Conclusions

This paper proposes a new low-rank factorization method to estimate subspaces with an unknown noise distribution. The noise is modeled as a MoG, and the parameters of the subspace-MoG model are estimated under an MLE framework with an EM algorithm. Compared to existing L2 and L1 LRMF methods, which are optimal for Gaussian or Laplacian noise, our method performs better (on average) in a wide variety of synthetic and real noise experiments. Our method has proven useful for modeling different types of noise in faces under different illuminations and in background subtraction. A limitation of our approach is the non-convexity of the cost function. Currently, we are exploring spectral approaches to improve robustness to local minima. Finally, adding robustness to different types of noise can be similarly applied to other component analysis methods (e.g., linear discriminant analysis, normalized cuts) that are formulated as least-squares problems [10].

Acknowledgements

This research was supported by the 973 Program of China and by NSFC projects. Fernando De la Torre was partially supported by Grant CPS-0931999 and NSF IIS-1116583. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
References

[1] P. M. Q. Aguiar, J. M. F. Xavier, and M. Stosic. Spectrally optimal factorization of incomplete matrices. In CVPR, 2008.
[2] D. Andrews and C. Mallows. Scale mixtures of normal distributions. Journal of the Royal Statistical Society, Series B, 36(1):99-102, 1974.
[3] C. Archambeau, N. Delannay, and M. Verleysen. Robust probabilistic projections. In ICML, 2006.
[4] D. J. Bartholomew. Latent Variable Models and Factor Analysis. Charles Griffin, 1987.
[5] R. Basri and D. W. Jacobs. Lambertian reflectance and linear subspaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25:218-233, 2003.
[6] A. Buchanan and A. Fitzgibbon. Damped Newton algorithms for matrix factorization with missing data. In CVPR, 2005.
[7] E. Candès, X. D. Li, Y. Ma, and J. Wright. Robust principal component analysis? Journal of the ACM, 58, 2011.
[8] E. Candès and J. Romberg. l1-MAGIC: Recovery of sparse signals via convex programming. Technical report, California Institute of Technology, 2005.
[9] P. Chen. Optimization algorithms on subspaces: Revisiting missing data problem in low-rank matrix. International Journal of Computer Vision, 80:125-142, 2008.
[10] F. De la Torre. A least-squares framework for component analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(6), 2012.
[11] F. De la Torre and M. J. Black. A framework for robust subspace learning. International Journal of Computer Vision, 54:117-142, 2003.
[12] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39(1):1-38, 1977.
[13] C. Ding, D. Zhou, X. F. He, and H. Y. Zha. R1-PCA: Rotational invariant L1-norm principal component analysis for robust subspace factorization. In ICML, 2006.
[14] A. Eriksson and A. van den Hengel. Efficient computation of robust low-rank matrix approximations in the presence of missing data using the L1 norm. In CVPR, 2010.
[15] K. R. Gabriel and S. Zamir. Lower rank approximation of matrices by least squares with any choice of weights. Technometrics, 21:489-498, 1979.
[16] A. S. Georghiades, P. N. Belhumeur, and D. J. Kriegman. From few to many: Illumination cone models for face recognition under variable lighting and pose. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23:643-660, 2001.
[17] G. H. Golub and C. F. van Loan. Matrix Computations. Johns Hopkins University Press.
[18] H. Ji, C. Q. Liu, Z. W. Shen, and Y. H. Xu. Robust video denoising using low rank matrix completion. In CVPR, 2010.
[19] C. Julià, F. Lumbreras, and A. D. Sappa. A factorization-based approach to photometric stereo. International Journal of Imaging Systems and Technology, 21:115-119, 2011.
[20] Q. F. Ke and T. Kanade. Robust L1 norm factorization in the presence of outliers and missing data by alternative convex programming. In CVPR, 2005.
[21] J. J. Koenderink and A. J. van Doorn. The generic bilinear calibration-estimation problem. International Journal of Computer Vision, 23(3):217-234, 1997.
[22] N. Kwak. Principal component analysis based on L1-norm maximization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30, 2008.
[23] B. Lakshminarayanan, G. Bouchard, and C. Archambeau. Robust Bayesian matrix factorisation. In AISTATS, 2011.
[24] L. Y. Li, W. M. Huang, I. Gu, and Q. Tian. Statistical modeling of complex backgrounds for foreground object detection. IEEE Transactions on Image Processing, 13(11), 2004.
[25] V. Maz'ya and G. Schmidt. On approximate approximations using Gaussian kernels. IMA Journal of Numerical Analysis, 16(1):13-29, 1996.
[26] G. J. McLachlan and K. E. Basford. Mixture Models: Inference and Applications to Clustering. Marcel Dekker, 1988.
[27] D. Y. Meng, Z. B. Xu, L. Zhang, and J. Zhao. A cyclic weighted median method for L1 low-rank matrix factorization with missing entries. In AAAI, 2013.
[28] K. Mitra, S. Sheorey, and R. Chellappa. Large-scale matrix factorization with missing data under additional constraints. In NIPS, 2010.
[29] B. Moghaddam and A. Pentland. Probabilistic visual learning for object representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19:696-710, 1997.
[30] J. Nakamura. Image Sensors and Signal Processing for Digital Still Cameras. CRC Press, 2005.
[31] T. Okatani and K. Deguchi. On the Wiberg algorithm for matrix factorization in the presence of missing components. International Journal of Computer Vision, 72:329-337, 2007.
[32] T. Okatani, T. Yoshida, and K. Deguchi. Efficient algorithm for low-rank matrix factorization with missing components and performance comparison of latest algorithms. In ICCV, 2011.
[33] S. Roweis. EM algorithms for PCA and SPCA. In NIPS, 1998.
[34] A. Shashua. On photometric issues in 3D visual recognition from a single 2D image. International Journal of Computer Vision, 21:99-122, 1997.
[35] N. Srebro and T. Jaakkola. Weighted low-rank approximations. In ICML, 2003.
[36] P. Sturm. Algorithms for plane-based pose estimation. In CVPR, 2000.
[37] M. E. Tipping and C. M. Bishop. Mixtures of probabilistic principal component analysers. Neural Computation, 11:443-482, 1999.
[38] M. E. Tipping and C. M. Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B, 61:611-622, 1999.
[39] C. Tomasi and T. Kanade. Shape and motion from image streams under orthography: A factorization method. International Journal of Computer Vision, 9:137-154, 1992.
[40] M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3:71-86, 1991.
[41] R. Vidal, R. Tron, and R. Hartley. Multiframe motion segmentation with missing data using PowerFactorization and GPCA. International Journal of Computer Vision, 79:85-105, 2008.
[42] N. Y. Wang, T. S. Yao, J. D. Wang, and D. Y. Yeung. A probabilistic approach to robust matrix factorization. In ECCV, 2012.
[43] J. Wright, Y. G. Peng, Y. Ma, A. Ganesh, and S. Rao. Robust principal component analysis: Exact recovery of corrupted low-rank matrices by convex optimization. In NIPS, 2009.
[44] K. Zhao and Z. Y. Zhang. Successively alternate least square for low-rank matrix factorization with bounded missing data. Computer Vision and Image Understanding, 114, 2010.
[45] Y. Q. Zheng, G. C. Liu, S. Sugimoto, S. C. Yan, and M. Okutomi. Practical low-rank matrix approximation under robust L1-norm. In CVPR, 2012.