SOLUTION MANUAL
Linear Algebra and Optimization for Machine
Learning
1st Edition by Charu Aggarwal. Chapters 1 – 11
Contents

1  Linear Algebra and Optimization: An Introduction
2  Linear Transformations and Linear Systems
3  Diagonalizable Matrices and Eigenvectors
4  Optimization Basics: A Machine Learning View
5  Optimization Challenges and Advanced Solutions
6  Lagrangian Relaxation and Duality
7  Singular Value Decomposition
8  Matrix Factorization
9  The Linear Algebra of Similarity
10 The Linear Algebra of Graphs
11 Optimization in Computational Graphs
Chapter 1

Linear Algebra and Optimization: An Introduction
1. For any two vectors x and y, which are each of length a, show that (i) x - y is orthogonal to x + y, and (ii) the dot product of x - 3y and x + 3y is negative.
(i) By the distributive property of the dot product, (x - y) . (x + y) = x . x - y . y. The dot product of a vector with itself is its squared length. Since both vectors have the same length a, the result is a^2 - a^2 = 0. (ii) In the second case, a similar argument shows that the result is a^2 - 9a^2, which is negative.
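Both identities can be checked numerically. The sketch below uses two arbitrary example vectors that happen to share the same length a = 5; the helper `dot` is not from the text, just a plain implementation of the dot product.

```python
def dot(u, v):
    """Dot product of two equal-dimension vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

# Two different vectors with the same length a = 5.
x = [3.0, 4.0, 0.0]
y = [0.0, 0.0, 5.0]

diff = [xi - yi for xi, yi in zip(x, y)]
summ = [xi + yi for xi, yi in zip(x, y)]
print(dot(diff, summ))   # (x - y) . (x + y) = a^2 - a^2 = 0

d3 = [xi - 3 * yi for xi, yi in zip(x, y)]
s3 = [xi + 3 * yi for xi, yi in zip(x, y)]
print(dot(d3, s3))       # a^2 - 9a^2 = 25 - 225 = -200
```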
2. Consider a situation in which you have three matrices A, B, and C, of sizes 10 x 2, 2 x 10, and 10 x 10, respectively.
(a) Suppose you had to compute the matrix product ABC. From an efficiency perspective, would it computationally make more sense to compute (AB)C or would it make more sense to compute A(BC)?
(b) If you had to compute the matrix product CAB, would it make more sense to compute (CA)B or C(AB)?
The main point is to keep the size of the intermediate matrix as small as possible in order to reduce both computational and space requirements. In the case of ABC, it makes sense to compute BC first. In the case of CAB, it makes sense to compute CA first. This type of associativity property is used frequently in machine learning in order to reduce computational requirements.
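The comparison can be made concrete with the standard cost model in which multiplying an m x n matrix by an n x p matrix takes m * n * p scalar multiplications. The sketch below tallies both parenthesizations for each product; the cost function is an illustration, not from the text.

```python
def matmul_cost(m, n, p):
    """Scalar multiplications to multiply an (m x n) by an (n x p) matrix."""
    return m * n * p

# Sizes from the problem: A is 10x2, B is 2x10, C is 10x10.
cost_AB_then_C = matmul_cost(10, 2, 10) + matmul_cost(10, 10, 10)  # (AB)C
cost_A_then_BC = matmul_cost(2, 10, 10) + matmul_cost(10, 2, 10)   # A(BC)
print(cost_AB_then_C, cost_A_then_BC)   # 1200 vs 400: compute BC first

cost_CA_then_B = matmul_cost(10, 10, 2) + matmul_cost(10, 2, 10)   # (CA)B
cost_C_then_AB = matmul_cost(10, 2, 10) + matmul_cost(10, 10, 10)  # C(AB)
print(cost_CA_then_B, cost_C_then_AB)   # 400 vs 1200: compute CA first
```

In both cases the cheaper order is the one whose intermediate matrix is the small 10 x 2 or 2 x 10 product rather than a 10 x 10 one.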
3. Show that if a matrix A satisfies A = -A^T, then all the diagonal elements of the matrix are 0.
Note that A + A^T = 0. However, this matrix also contains twice the diagonal elements of A on its diagonal. Therefore, the diagonal elements of A must be 0.
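A quick numeric illustration: any matrix of the form A = B - B^T satisfies A = -A^T, so its diagonal should come out as all zeros. The matrix B below is an arbitrary example, not from the text.

```python
# An arbitrary 3x3 matrix B; A = B - B^T is then skew-symmetric.
B = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
n = len(B)

A = [[B[i][j] - B[j][i] for j in range(n)] for i in range(n)]

# Since A + A^T = 0 and its diagonal holds 2*a_ii, each a_ii must be 0.
print([A[i][i] for i in range(n)])   # [0, 0, 0]
```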
4. Show that if we have a matrix satisfying A = -A^T, then for any column vector x, we have x^T A x = 0.
Note that the transpose of the scalar x^T A x remains unchanged. Therefore, we have x^T A x = (x^T A x)^T = x^T A^T x = -x^T A x. Therefore, we have 2 x^T A x = 0, so x^T A x = 0.
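The result can be sanity-checked numerically. The sketch below evaluates the quadratic form x^T A x for an example skew-symmetric matrix and an arbitrary vector; the helper `quad_form` is illustrative, not from the text.

```python
def quad_form(A, x):
    """Compute the scalar x^T A x for a square matrix A and vector x."""
    n = len(x)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

# A skew-symmetric matrix (A = -A^T) and an arbitrary vector x.
A = [[0, 2, -1],
     [-2, 0, 4],
     [1, -4, 0]]
x = [3, -1, 2]

# Each off-diagonal pair x_i a_ij x_j + x_j a_ji x_i cancels, and the
# diagonal is zero, so the whole sum vanishes.
print(quad_form(A, x))   # 0
```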