Projection video display apparatus and image adjustment method

Abstract

PROBLEM TO BE SOLVED: To provide a projection type image display apparatus and an image adjustment method capable of flexibly changing a display position of an image projected on a projection surface.
A projection display apparatus 100 includes a projection unit 110 configured to project an image, a detection unit 240 configured to detect a projection frame 420 provided on a projection surface 400, and an element control unit 270 configured to move the position of an image 430 within a projectable range 410 onto which the projection unit 110 can project, by controlling a liquid crystal panel 50. The element control unit 270 controls the liquid crystal panel 50 so that the image 430 fits within the projection frame 420.
[Selection] Figure 3

Classifications

H04N9/3188 Scale or resolution adjustment

Description

  The present invention relates to a projection display apparatus having a light modulation element configured to modulate light emitted from a light source and a projection unit configured to project light emitted from the light modulation element onto a projection surface, and to an image adjustment method.

  Conventionally, there has been known a projection display apparatus including a light modulation element that modulates light emitted from a light source and a projection unit that projects light emitted from the light modulation element onto a projection surface.

  Here, the range onto which the projection display apparatus (projection unit) can project (hereinafter, the projectable range) may not coincide with the projection frame provided on the projection surface.

  On the other hand, a method has been proposed in which an image included in the projectable range is placed in the projection frame by the following procedure (for example, Patent Document 1). First, the projection display apparatus captures an image of the projection surface and specifies the coordinates of the four corners of a projection frame (for example, a screen frame) provided on the projection surface. Second, the projection display apparatus specifies the coordinates of the four corners of the image projected on the projection surface. Third, the projection display apparatus corrects the video signal so that the image fits in the projection frame, based on the coordinates of the four corners of the projection frame and the coordinates of the four corners of the image.

JP 2008-251026 A

  However, because the projectable range is fixed in the above-described technique, the display position of the image on the projection plane cannot be changed arbitrarily even when such a change is attempted, owing to the restriction imposed by the projectable range.

  Accordingly, the present invention has been made to solve the above-described problem, and an object of the present invention is to provide a projection display apparatus and an image adjustment method that can flexibly change the display position of an image projected on a projection plane.

  The projection display apparatus according to the first feature includes a light modulation element (liquid crystal panel 50) configured to modulate light emitted from a light source (light source 10), and a projection unit (projection unit 110) configured to project light emitted from the light modulation element onto a projection surface. The projection display apparatus further includes a detection unit (detection unit 240) configured to detect a projection frame provided on the projection surface, and an element control unit (element control unit 270) configured to move, by controlling the light modulation element, the position of an image projected on the projection surface within the projectable range onto which the projection unit can project. The element control unit controls the light modulation element so that the image projected on the projection surface is accommodated in the projection frame.

  In the first feature, the element control unit controls the light modulation element so as to display, within the projection frame, an indicator indicating a direction in which the image projected on the projection surface can be moved, or an indicator indicating a direction in which the image projected on the projection surface can be enlarged or reduced.

  In the first feature, the projection display apparatus further includes a projection unit control unit configured to move the position of the projectable range by controlling the projection unit, and the element control unit controls the light modulation element so that the image projected on the projection surface is accommodated in the projection frame in conjunction with the movement of the position of the projectable range.

  In the first feature, the element control unit moves the position of the image projected on the projection surface within the projectable range, in conjunction with the enlargement or reduction of the projectable range, without changing the center position of the image projected on the projection surface.

  In the first feature, the element control unit controls the light modulation element so as to display a candidate position where an image projected on the projection plane can be displayed within the projection frame.

  In the first feature, the detection unit detects the projection frame by detecting a detection target provided on the projection surface.

  In the first feature, the element control unit has a first operation mode and a second operation mode for controlling the light modulation element. The amount by which the image projected on the projection surface is moved differs between the first operation mode and the second operation mode; in the second operation mode, the image projected on the projection surface moves until it reaches the end of its movement range.

  In the first feature, the projection display apparatus further includes a calculation unit that specifies the overlapping range of the projectable range and the projection frame, and the element control unit controls the light modulation element so as to display the overlapping range.

  In the first feature, when an attempt is made to move the image projected on the projection surface beyond its movement range, the element control unit controls the light modulation element so as to expand the unprojected area of the projection frame.

  In the first feature, when an attempt is made to move the image projected on the projection surface beyond its movement range, the element control unit controls the light modulation element so that the image projected on the projection surface becomes translucent.

  In the first feature, the projection display apparatus further includes a remote controller that transmits a command for moving the position of the image projected on the projection surface by means of the element control unit, and a first receiving unit and a second receiving unit that are disposed at mutually opposite positions and receive the signal transmitted from the remote controller. The element control unit controls the light modulation element so that the moving direction of the image projected on the projection surface is opposite depending on whether the first receiving unit or the second receiving unit receives the command.

  According to the present invention, it is possible to provide a projection display apparatus and an image adjustment method that can flexibly change the display position of an image projected on a projection surface.

FIG. 1 is a diagram showing an outline of a projection display apparatus 100 according to a first embodiment.
FIG. 2 is a diagram illustrating a configuration of the projection display apparatus 100 according to the first embodiment.
FIG. 3 is a block diagram showing a control unit 200 according to the first embodiment.
FIG. 4 is a diagram showing an example of a stored test pattern image according to the first embodiment.
FIG. 5 is a diagram showing an example of the stored test pattern image according to the first embodiment.
FIG. 6 is a diagram showing an example of the stored test pattern image according to the first embodiment.
FIG. 7 is a diagram showing an example of the stored test pattern image according to the first embodiment.
FIG. 8 is a diagram showing an example of a captured test pattern image according to the first embodiment.
FIG. 9 is a diagram showing an example of the captured test pattern image according to the first embodiment.
FIG. 10 is a diagram for explaining a method of calculating intersections in the projected test pattern image according to the first embodiment.
FIG. 11 is a diagram showing a display example of an indicator according to the first embodiment.
FIG. 12 is a diagram showing a display example of the indicator according to the first embodiment.
FIG. 13 is a flowchart showing the operation of the projection display apparatus 100 according to the first embodiment.
FIG. 14 is a diagram showing an example of image reduction according to Modification 1.
FIG. 15 is a diagram showing an example of image reduction according to Modification 1.
FIG. 16 is a diagram showing an example of image enlargement according to Modification 1.
FIG. 17 is a diagram showing an example of image enlargement according to Modification 1.
FIG. 18 is a diagram showing a display example of candidate positions according to Modification 2.
FIG. 19 is a diagram showing a display example of candidate positions according to Modification 2.
FIG. 20 is a diagram showing an example of detection of a projection frame 420 according to Modification 3.
FIG. 21 is a diagram showing an example of detection of the projection frame 420 according to Modification 3.
FIG. 22 is a diagram showing an example of detection of the projection frame 420 according to Modification 3.
FIG. 23 is a diagram showing an example of aspect ratio adjustment according to Modification 4.
FIG. 24 is a diagram showing an example of aspect ratio adjustment according to Modification 4.
FIG. 25 is a diagram showing a display example for selecting whether aspect ratio adjustment is necessary according to Modification 5.
FIG. 26 is a diagram showing an example in which a plurality of images 430 are arranged and displayed according to Modification 6.
FIG. 27 is a diagram showing an example in which a plurality of images 430 are arranged and displayed according to Modification 6.
FIG. 28 is a diagram showing an example in which an element control unit 270 according to Modification 7 readjusts the size of the image 430 in accordance with a change in the aspect ratio of the image 430 during projection.
FIG. 29 is a diagram showing an example in which the element control unit 270 according to Modification 7 readjusts the size of the image 430 in accordance with a change in the aspect ratio of the image 430 during projection.
FIG. 30 is a diagram showing an example of operation modes with different movement amounts according to Modification 8.
FIG. 31 is a diagram showing an example of operation modes with different movement amounts according to Modification 8.
FIG. 32 is a diagram showing an example of displaying a movable range 450 of the image 430 according to Modification 9.
FIG. 33 is a diagram showing a case where a user attempts to move the image 430 beyond its movable limit according to Modification 10.
FIG. 34 is a diagram showing a case where a user attempts to move the image 430 beyond its movable limit according to Modification 10.
FIG. 35 is a diagram showing another implementation example according to Modification 10.
FIG. 36 is a diagram showing the moving direction of the image 430 when a direction key of a remote controller 500 is operated from the front or the rear of the projection display apparatus 100 according to Modification 11.
FIG. 37 is a diagram showing another implementation example according to Modification 11.
FIG. 38 is a diagram showing an interactive pen 600 having a remote control function for moving the image 430 according to Modification 12.
FIG. 39 is a diagram showing another implementation example according to Modification 12.

  Hereinafter, a projection display apparatus according to an embodiment of the present invention will be described with reference to the drawings. In the following description of the drawings, the same or similar parts are denoted by the same or similar reference numerals.

  However, it should be noted that the drawings are schematic and that the dimensional ratios and the like differ from the actual ones. Therefore, specific dimensions and the like should be determined in consideration of the following description. Moreover, the drawings may of course include portions whose dimensional relationships and ratios differ from one another.

[Outline of Embodiment]
A projection display apparatus according to an embodiment includes a light modulation element configured to modulate light emitted from a light source, and a projection unit configured to project light emitted from the light modulation element onto a projection surface. The projection display apparatus further includes a detection unit configured to detect a projection frame provided on the projection surface, a projection unit control unit configured to move the position of the projectable range, which is the range onto which the projection unit can project, by controlling the projection unit, and an element control unit configured to move the position of an image projected on the projection surface within the projectable range by controlling the light modulation element. The element control unit controls the light modulation element so that the image projected on the projection surface is accommodated in the projection frame.

  As described above, in the embodiment, the image projected on the projection surface is controlled so as to be accommodated in the projection frame. Therefore, the display position of the image projected on the projection surface can be flexibly changed.

[First Embodiment]
(Outline of projection display device)
Hereinafter, the projection display apparatus according to the first embodiment will be described with reference to the drawings. FIG. 1 is a diagram showing an outline of a projection display apparatus 100 according to the first embodiment.

  As shown in FIG. 1, the projection display apparatus 100 is provided with an imaging device 300. The projection display apparatus 100 projects image light on the projection plane 400.

  The imaging device 300 is configured to image the projection plane 400. That is, the imaging device 300 is configured to detect reflected light of the image light projected on the projection plane 400 by the projection display apparatus 100. The imaging device 300 may be built into the projection display apparatus 100 or may be provided separately from the projection display apparatus 100.

  The projection plane 400 is configured by a screen or the like. The range in which the projection display apparatus 100 can project image light (projectable range 410) is formed on the projection plane 400. Further, the projection plane 400 has a display frame constituted by an outer frame of the screen.

  In the first embodiment, a case where the optical axis N of the projection display apparatus 100 does not coincide with the normal line M of the projection plane 400 is illustrated. For example, a case where the optical axis N and the normal line M form an angle θ is illustrated.

  That is, in the first embodiment, since the optical axis N does not coincide with the normal line M, the projectable range 410 (image displayed on the projection plane 400) is distorted. In the first embodiment, a method for correcting such distortion in the projectable range 410 will be mainly described.

(Configuration of projection display device)
Hereinafter, the projection display apparatus according to the first embodiment will be described with reference to the drawings. FIG. 2 is a diagram illustrating a configuration of the projection display apparatus 100 according to the first embodiment.

  As shown in FIG. 2, the projection display apparatus 100 includes a projection unit 110 and an illumination device 120.

  The projection unit 110 projects the image light emitted from the illumination device 120 onto a projection surface (not shown).

  First, the illumination device 120 includes a light source 10, a UV/IR cut filter 20, a fly-eye lens unit 30, a PBS array 40, a plurality of liquid crystal panels 50 (a liquid crystal panel 50R, a liquid crystal panel 50G, and a liquid crystal panel 50B), and a cross dichroic prism 60.

  The light source 10 is a light source that emits white light (for example, a UHP lamp or a xenon lamp). That is, the white light emitted from the light source 10 includes red component light R, green component light G, and blue component light B.

  The UV / IR cut filter 20 transmits visible light components (red component light R, green component light G, and blue component light B). The UV / IR cut filter 20 shields infrared light components and ultraviolet light components.

  The fly-eye lens unit 30 makes the light emitted from the light source 10 uniform. Specifically, the fly eye lens unit 30 includes a fly eye lens 31 and a fly eye lens 32. The fly-eye lens 31 and the fly-eye lens 32 are each composed of a plurality of minute lenses. Each microlens condenses the light emitted from the light source 10 so that the light emitted from the light source 10 is irradiated on the entire surface of the liquid crystal panel 50.

  The PBS array 40 aligns the polarization state of the light emitted from the fly-eye lens unit 30. For example, the PBS array 40 aligns the light emitted from the fly-eye lens unit 30 with S-polarized light (or P-polarized light).

The liquid crystal panel 50R modulates the red component light R based on the red output signal Rout. On the side where light enters the liquid crystal panel 50R, an incident-side polarizing plate 52R is provided that transmits light having one polarization direction (for example, S-polarized light) and shields light having the other polarization direction (for example, P-polarized light). On the side where light exits the liquid crystal panel 50R, an exit-side polarizing plate 53R is provided that shields light having one polarization direction (for example, S-polarized light) and transmits light having the other polarization direction (for example, P-polarized light).

The liquid crystal panel 50G modulates the green component light G based on the green output signal Gout. On the side where light enters the liquid crystal panel 50G, an incident-side polarizing plate 52G is provided that transmits light having one polarization direction (for example, S-polarized light) and shields light having the other polarization direction (for example, P-polarized light). On the side where light exits the liquid crystal panel 50G, an exit-side polarizing plate 53G is provided that shields light having one polarization direction (for example, S-polarized light) and transmits light having the other polarization direction (for example, P-polarized light).

The liquid crystal panel 50B modulates the blue component light B based on the blue output signal Bout. On the side where light enters the liquid crystal panel 50B, an incident-side polarizing plate 52B is provided that transmits light having one polarization direction (for example, S-polarized light) and shields light having the other polarization direction (for example, P-polarized light). On the side where light exits the liquid crystal panel 50B, an exit-side polarizing plate 53B is provided that shields light having one polarization direction (for example, S-polarized light) and transmits light having the other polarization direction (for example, P-polarized light).

The red output signal Rout , the green output signal Gout, and the blue output signal Bout constitute a video output signal. The video output signal is a signal for each of a plurality of pixels constituting one frame.

  Here, each liquid crystal panel 50 may be provided with a compensation plate (not shown) for improving the contrast ratio and the transmittance. Each polarizing plate may have a pre-polarizing plate that reduces the amount of light incident on the polarizing plate and the thermal burden.

  The cross dichroic prism 60 constitutes a color combining unit that combines light emitted from the liquid crystal panel 50R, the liquid crystal panel 50G, and the liquid crystal panel 50B. The combined light emitted from the cross dichroic prism 60 is guided to the projection unit 110.

  Second, the illumination device 120 includes a mirror group (mirrors 71 to 76) and a lens group (lenses 81 to 85).

  The mirror 71 is a dichroic mirror that transmits the blue component light B and reflects the red component light R and the green component light G. The mirror 72 is a dichroic mirror that transmits the red component light R and reflects the green component light G. The mirror 71 and the mirror 72 constitute a color separation unit that separates the red component light R, the green component light G, and the blue component light B.

  The mirror 73 reflects the red component light R, the green component light G, and the blue component light B, and guides the red component light R, the green component light G, and the blue component light B to the mirror 71 side. The mirror 74 reflects the blue component light B and guides the blue component light B to the liquid crystal panel 50B side. The mirror 75 and the mirror 76 reflect the red component light R and guide the red component light R to the liquid crystal panel 50R side.

  The lens 81 is a condenser lens that collects the light emitted from the PBS array 40. The lens 82 is a condenser lens that collects the light reflected by the mirror 73.

  The lens 83R collimates the red component light R so that the liquid crystal panel 50R is irradiated with the red component light R. The lens 83G collimates the green component light G so that the liquid crystal panel 50G is irradiated with the green component light G. The lens 83B collimates the blue component light B so that the liquid crystal panel 50B is irradiated with the blue component light B.

  The lens 84 and the lens 85 are relay lenses that substantially image the red component light R on the liquid crystal panel 50R while suppressing the expansion of the red component light R.

(Configuration of control unit)
The control unit according to the first embodiment will be described below with reference to the drawings. FIG. 3 is a block diagram showing the control unit 200 according to the first embodiment. The control unit 200 is provided in the projection display apparatus 100 and controls the projection display apparatus 100.

The control unit 200 converts the video input signal into a video output signal. The video input signal includes a red input signal Rin, a green input signal Gin, and a blue input signal Bin. The video output signal includes a red output signal Rout, a green output signal Gout, and a blue output signal Bout. The video input signal and the video output signal are signals input for each of a plurality of pixels constituting one frame.

  As shown in FIG. 3, the control unit 200 includes a video signal receiving unit 210, a storage unit 220, a reading unit 230, a detection unit 240, a calculation unit 250, a projection unit control unit 260, and an element control unit 270.

  The video signal receiving unit 210 receives a video input signal from an external device (not shown) such as a DVD player or a TV tuner.

  The storage unit 220 stores various information. Specifically, the storage unit 220 stores a test pattern image that includes at least a part of each of three or more line segments forming three or more intersections. The three or more line segments are inclined with respect to a predetermined reading direction.

  The predetermined reading direction is the direction of the predetermined lines constituting the test pattern image. As will be described later, it should be noted that the reading unit 230 reads, from the captured image captured by the imaging apparatus 300, the portion corresponding to each predetermined line of the test pattern image into a line buffer, one line at a time.

Hereinafter, examples of the test pattern image will be described with reference to FIGS. 4 to 7. As shown in FIGS. 4 to 6, the test pattern image is an image that includes at least a part of four line segments (Ls1 to Ls4) forming four intersections (Ps1 to Ps4). In the first embodiment, the four line segments (Ls1 to Ls4) are represented by a difference in shade or brightness (an edge).

Specifically, as shown in FIG. 4, the test pattern image may have a black background and a white diamond. Here, the four sides of the white diamond form at least a part of the four line segments (Ls1 to Ls4). Incidentally, the four line segments (Ls1 to Ls4) are inclined with respect to the predetermined reading direction (the horizontal direction).

Alternatively, as shown in FIG. 5, the test pattern image may have a black background and white line segments. The white line segments constitute parts of the four sides of the white diamond shown in FIG. 4. Here, the white line segments constitute at least a part of the four line segments (Ls1 to Ls4). Incidentally, the four line segments (Ls1 to Ls4) are inclined with respect to the predetermined reading direction (the horizontal direction).

Alternatively, as shown in FIG. 6, the test pattern image may have a black background and a pair of white triangles. Here, the sides of the pair of white triangles constitute at least a part of the four line segments (Ls1 to Ls4). Incidentally, the four line segments (Ls1 to Ls4) are inclined with respect to the predetermined reading direction (the horizontal direction).

Alternatively, as shown in FIG. 7, the test pattern image may have a black background and white line segments. Here, the white line segments constitute at least a part of the four line segments (Ls1 to Ls4). As shown in FIG. 7, the four intersections (Ps1 to Ps4) formed by the four line segments (Ls1 to Ls4) may lie outside the projectable range 410. Incidentally, the four line segments (Ls1 to Ls4) are inclined with respect to the predetermined reading direction (the horizontal direction).

  The reading unit 230 reads the captured image from the imaging apparatus 300. Specifically, the reading unit 230 sequentially reads the captured image of the test pattern image from the imaging apparatus 300 along the predetermined reading direction of the test pattern image. In other words, the reading unit 230 includes a line buffer and reads, from the captured image captured by the imaging apparatus 300, the portion corresponding to each predetermined line of the test pattern image into the line buffer, one line at a time. It should be noted that the reading unit 230 therefore does not require a frame buffer.
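
  The line-by-line readout described above can be pictured with the following sketch. This is an illustration only, not taken from the specification; the helper names read_next_line and find_edge_points, and the use of Python, are assumptions made for the sketch.

```python
# Minimal sketch of frame-buffer-free readout (assumption: the imaging apparatus
# delivers the captured image one row at a time; helper names are hypothetical).
def scan_captured_image(read_next_line, find_edge_points, num_lines):
    """Process a captured test-pattern image one line at a time.

    Only a single line buffer is held in memory, mirroring the point in the text
    that the reading unit 230 does not require a frame buffer.
    """
    edge_points = []                       # (x, y) positions where an edge was found
    for y in range(num_lines):
        line_buffer = read_next_line(y)    # one row of pixels from the imaging apparatus
        for x in find_edge_points(line_buffer):
            edge_points.append((x, y))     # remember the edge position and its row
    return edge_points
```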

  First, the detection unit 240 detects a display frame provided on the projection plane 400. As described above, the display frame is an outer frame of the screen.

  Specifically, the detection unit 240 may be configured to detect the four corners of the display frame. For example, the detection unit 240 detects the four corners of the display frame based on the captured images sequentially read out along the predetermined reading direction by the reading unit 230.

  Secondly, the detection unit 240 acquires three or more intersections in the captured image based on the captured images sequentially read along the predetermined reading direction by the reading unit 230.

  Specifically, the detection unit 240 acquires three or more intersections in the captured image by the following procedure. Here, a case where the test pattern image is the image (open diamond) shown in FIG. 4 is illustrated.

First, as illustrated in FIG. 8, the detection unit 240 acquires points Pedge at which there is a difference in shade (that is, an edge), based on the captured image read into the line buffer by the reading unit 230. That is, the detection unit 240 acquires a group of points Pedge corresponding to the four sides of the white diamond in the test pattern image.

Second, as illustrated in FIG. 9, the detection unit 240 acquires four line segments (Lt1 to Lt4) in the captured image based on the group of points Pedge. That is, the detection unit 240 acquires the four line segments (Lt1 to Lt4) corresponding to the four line segments (Ls1 to Ls4) in the test pattern image.

Third, as shown in FIG. 9, the detection unit 240 obtains the four intersections (Pt1 to Pt4) in the captured image based on the four line segments (Lt1 to Lt4). That is, the detection unit 240 acquires the four intersections (Pt1 to Pt4) corresponding to the four intersections (Ps1 to Ps4) in the test pattern image.
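
  Purely as an illustration of the line-fitting and intersection steps (the specification does not reproduce an algorithm for them), the edge points gathered for each of the four sides could be processed as follows. The use of numpy, the function names, and the assumption that the points are already grouped per segment are all assumptions of this sketch.

```python
# Illustrative sketch: fit a line to the edge points of one segment and intersect
# two fitted lines. Assumes numpy and a per-segment grouping of the edge points.
import numpy as np

def fit_line(points):
    """Fit a 2-D line a*x + b*y + c = 0 to edge points by total least squares."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The line direction is the dominant right singular vector of the centered points.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    a, b = -direction[1], direction[0]     # normal to the line
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def intersect(l1, l2):
    """Intersection point of two lines given as coefficient triples (a, b, c)."""
    m = np.array([[l1[0], l1[1]], [l2[0], l2[1]]])
    rhs = np.array([-l1[2], -l2[2]])
    return np.linalg.solve(m, rhs)         # (x, y), e.g. a point such as Pt1
```

  Applying intersect to adjacent fitted lines (Lt1 and Lt2, Lt2 and Lt3, and so on) yields points corresponding to the intersections Pt1 to Pt4.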

The calculation unit 250 calculates the positional relationship between the projection display apparatus 100 and the projection plane 400 based on three or more intersections (for example, Ps1 to Ps4) in the test pattern image and three or more intersections (for example, Pt1 to Pt4) in the captured image. Specifically, the calculation unit 250 calculates the amount of deviation between the optical axis N of the projection display apparatus 100 (projection unit 110) and the normal line M of the projection plane 400.

  Hereinafter, the test pattern image stored in the storage unit 220 is referred to as the stored test pattern image. The test pattern image included in the captured image is referred to as the captured test pattern image. The test pattern image projected on the projection plane 400 is referred to as the projected test pattern image.

First, the calculation unit 250 calculates the coordinates of the four intersection points (Pu1 to Pu4) in the projected test pattern image. Here, the intersection Ps1 of the stored test pattern image, the intersection Pt1 of the captured test pattern image, and the intersection Pu1 of the projected test pattern image will be used as an example. The intersection point Ps1, the intersection point Pt1, and the intersection point Pu1 correspond to one another.

Hereinafter, a method of calculating the coordinates (Xu1, Yu1, Zu1) of the intersection point Pu1 will be described with reference to FIG. 10. It should be noted that the coordinates (Xu1, Yu1, Zu1) of the intersection point Pu1 are coordinates in a three-dimensional space having the focal point Os of the projection display apparatus 100 as its origin.

(1) The calculation unit 250 converts the coordinates (xs1, ys1) of the intersection point Ps1 in the two-dimensional plane of the stored test pattern image into the coordinates (Xs1, Ys1, Zs1) of the intersection point Ps1 in a three-dimensional space having the focal point Os of the projection display apparatus 100 as its origin. Specifically, the coordinates (Xs1, Ys1, Zs1) of the intersection point Ps1 are represented by the following expression.

  Note that As is a 3 × 3 conversion matrix, and can be acquired in advance by preprocessing such as calibration. That is, As is a known parameter.

Here, the plane perpendicular to the optical axis of the projection display apparatus 100 is spanned by the Xs axis and the Ys axis, and the optical axis of the projection display apparatus 100 is taken as the Zs axis.

Similarly, the calculation unit 250 converts the coordinates (xt1, yt1) of the intersection point Pt1 in the two-dimensional plane of the captured test pattern image into the coordinates (Xt1, Yt1, Zt1) of the intersection point Pt1 in a three-dimensional space having the focal point Ot of the imaging apparatus 300 as its origin.

  Note that At is a 3 × 3 conversion matrix, and can be acquired in advance by preprocessing such as calibration. That is, At is a known parameter.

Here, the plane perpendicular to the optical axis of the imaging apparatus 300 is spanned by the Xt axis and the Yt axis, and the orientation (imaging direction) of the imaging apparatus 300 is taken as the Zt axis. It should be noted that the inclination (vector) of the orientation (imaging direction) of the imaging apparatus 300 is known in this coordinate space.
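
  The conversion expressions themselves are not reproduced above. Purely as an assumed illustration consistent with the description of As and At as known 3 × 3 conversion matrices, the conversions can be pictured as follows; numpy and the function names are assumptions of this sketch.

```python
# Assumed formulation: homogeneous 2-D intersection coordinates are mapped into the
# respective 3-D spaces by the known 3 x 3 conversion matrices As and At.
import numpy as np

def to_projector_space(A_s, x_s, y_s):
    """Coordinates (Xs, Ys, Zs) of an intersection of the stored test pattern image
    in the space whose origin is the focal point Os of the projection display apparatus."""
    return A_s @ np.array([x_s, y_s, 1.0])

def to_camera_space(A_t, x_t, y_t):
    """Coordinates (Xt, Yt, Zt) of an intersection of the captured test pattern image
    in the space whose origin is the focal point Ot of the imaging apparatus."""
    return A_t @ np.array([x_t, y_t, 1.0])
```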

(2) The calculation unit 250 computes the equation of a straight line Lv connecting the intersection point Ps1 and the intersection point Pu1. Similarly, the calculation unit 250 computes the equation of a straight line Lw connecting the intersection point Pt1 and the intersection point Pu1. The equations of the straight line Lv and the straight line Lw are represented as follows.

(3) The calculation unit 250 converts the straight line Lw into a straight line Lw' in the three-dimensional space having the focal point Os of the projection display apparatus 100 as its origin. The straight line Lw' is represented by the following equation.

  Since the optical axis of the projection display apparatus 100 and the orientation (imaging direction) of the imaging apparatus 300 are known, the parameter R indicating the rotation component is known. Similarly, since the relative positions of the projection display apparatus 100 and the imaging apparatus 300 are known, the parameter T indicating the translation component is also known.

(4) The calculation unit 250 calculates the parameters Ks and Kt at the intersection of the straight line Lv and the straight line Lw' (that is, at the intersection point Pu1) based on the equations (3) and (5). Subsequently, the calculation unit 250 calculates the coordinates (Xu1, Yu1, Zu1) of the intersection point Pu1 based on the coordinates (Xs1, Ys1, Zs1) of the intersection point Ps1 and the parameter Ks. Alternatively, the calculation unit 250 calculates the coordinates (Xu1, Yu1, Zu1) of the intersection point Pu1 based on the coordinates (Xt1, Yt1, Zt1) of the intersection point Pt1 and the parameter Kt.

Thereby, the calculation unit 250 obtains the coordinates (Xu1, Yu1, Zu1) of the intersection point Pu1. Similarly, the calculation unit 250 calculates the coordinates (Xu2, Yu2, Zu2) of the intersection point Pu2, the coordinates (Xu3, Yu3, Zu3) of the intersection point Pu3, and the coordinates (Xu4, Yu4, Zu4) of the intersection point Pu4.
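
  The following sketch is an assumed reading of step (4), not a reproduction of equations (3) and (5): it treats Lv as a ray from the focal point Os (the origin) in the direction (Xs1, Ys1, Zs1), and Lw' as a ray starting at the position given by the translation T in the direction obtained by applying the rotation R to (Xt1, Yt1, Zt1). The least-squares formulation and the function name are assumptions.

```python
# Illustrative triangulation sketch: solve Ks * d_s = T + Kt * (R @ d_t) for the
# parameters Ks and Kt and return the resulting point Pu. Assumes numpy.
import numpy as np

def intersect_rays(d_s, d_t, R, T):
    """Estimate the coordinates (Xu, Yu, Zu) of the intersection point Pu."""
    d_s = np.asarray(d_s, dtype=float)
    d_w = np.asarray(R, dtype=float) @ np.asarray(d_t, dtype=float)
    T = np.asarray(T, dtype=float)
    # Two unknowns (Ks, Kt), three equations: least squares also tolerates the case
    # where the two lines do not meet exactly because of measurement error.
    A = np.column_stack([d_s, -d_w])
    (K_s, K_t), *_ = np.linalg.lstsq(A, T, rcond=None)
    return K_s * d_s                        # the point on Lv, i.e. Pu
```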

Second, the calculation unit 250 calculates the vector of the normal line M of the projection plane 400. Specifically, the calculation unit 250 calculates the vector of the normal line M of the projection plane 400 using the coordinates of at least three of the intersection points Pu1 to Pu4. The projection plane 400 is represented by the following expression, in which the parameters k1, k2, and k3 form the vector of the normal line M of the projection plane 400.

  Accordingly, the calculation unit 250 can calculate the amount of deviation between the optical axis N of the projection display apparatus 100 and the normal line M of the projection plane 400. That is, the calculation unit 250 can calculate the positional relationship between the projection display apparatus 100 and the projection plane 400.
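
  A minimal sketch of this computation, under the assumption that the amount of deviation is expressed as the angle between the optical axis N (taken here as the Zs axis) and the normal line M; the function names are hypothetical.

```python
# Illustrative sketch: plane normal from three of the points Pu1 to Pu4, and the
# angle between the optical axis and that normal. Assumes numpy.
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal M of the projection plane passing through three 3-D points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

def deviation_angle(normal):
    """Angle (radians) between the optical axis N (the Zs axis) and the normal M."""
    optical_axis = np.array([0.0, 0.0, 1.0])
    cos_theta = abs(float(np.dot(normal, optical_axis)))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```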

  The projection unit control unit 260 controls the projection unit 110. Specifically, the projection unit control unit 260 is configured to enlarge or reduce the projectable range 410 (image) by controlling the lens group provided in the projection unit 110. Further, the projection unit control unit 260 is configured to move the position of the projectable range 410 (image) within the projection plane 400 by controlling the lens group provided in the projection unit 110.

  For example, the projection unit control unit 260 controls the lens group provided in the projection unit 110 in accordance with a user operation performed through a user interface (not shown). Thereby, the projection unit control unit 260 enlarges, reduces, or moves the projectable range 410 (image).

  The element control unit 270 converts the video input signal into a video output signal, and controls the liquid crystal panel 50 based on the video output signal. The element control unit 270 has the following functions.

  First, the element control unit 270 has a function of automatically correcting the shape of the image projected on the projection plane 400 based on the positional relationship between the projection display apparatus 100 and the projection plane 400. That is, the element control unit 270 has a function of automatically performing keystone correction based on the positional relationship between the projection display apparatus 100 and the projection plane 400.
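
  The specification does not reproduce the correction algorithm itself. Purely as an illustration of how such a correction is commonly realized, the corner correspondences obtained by the calculation unit 250 could be used to estimate a homography with which the panel image is pre-warped; numpy, the direct linear transform, and the function name are assumptions of this sketch rather than the embodiment's own method.

```python
# Illustrative sketch: estimate the 3 x 3 homography H that maps four source corners
# to four destination corners (standard direct linear transform). Assumes numpy.
import numpy as np

def homography_from_corners(src, dst):
    """src and dst are sequences of four (x, y) pairs; returns H with H[2, 2] == 1."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)             # the null vector of A gives H up to scale
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

  Pre-warping the panel image with such a homography, so that the projected result appears rectangular within the projection frame, is one conventional form of keystone correction.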

  Second, the element control unit 270 controls the liquid crystal panel 50 in conjunction with the control of the projection unit 110. For example, the element control unit 270 controls the liquid crystal panel 50 as described below.

  The element control unit 270 controls the liquid crystal panel 50 so that the image projected on the projection plane 400 fits in the projection frame in conjunction with the movement of the position of the projectable range. Specifically, the element control unit 270 acquires the movement amount and movement speed of the position of the projectable range from the projection unit control unit 260, and controls the liquid crystal panel 50 according to the movement amount and movement speed. For example, when the projectable range shifts relative to the projection frame as the projectable range moves, the element control unit 270 controls the liquid crystal panel 50 to move the position of the image within the projectable range so that the image is kept within the projection frame.
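
  As an illustration only, this interlocked control can be pictured as a counter-shift of the image inside the light modulation panel. The parameter names and the pixels-per-unit conversion factor are assumptions of this sketch, not values taken from the specification.

```python
# Illustrative sketch: when the projectable range 410 moves by range_move on the
# projection plane, shift the image inside the panel by the opposite amount so that
# the image stays put inside the projection frame 420. All names are hypothetical.
def compensate_range_move(image_offset, range_move, px_per_unit, panel_size, image_size):
    """Return the new image offset in panel pixels after compensating a range move."""
    ox, oy = image_offset
    dx, dy = range_move
    new_x = ox - dx * px_per_unit           # move opposite to the range move
    new_y = oy - dy * px_per_unit
    # Keep the image inside the panel, i.e. inside the projectable range.
    new_x = max(0, min(new_x, panel_size[0] - image_size[0]))
    new_y = max(0, min(new_y, panel_size[1] - image_size[1]))
    return (new_x, new_y)
```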

  Third, the element control unit 270 controls the liquid crystal panel 50 so as to display an indicator indicating a direction in which an image projected on the projection plane 400 can move within the projection frame. For example, the indicator is an arrow indicating a direction in which an image can be moved in the horizontal direction or the vertical direction within the projection frame. The indicator may be an arrow indicating a direction in which an image can be moved in an oblique direction within the projection frame.

(Indicator display example)
Hereinafter, a display example of the indicator according to the first embodiment will be described with reference to the drawings. FIGS. 11 and 12 are diagrams showing display examples of the indicator according to the first embodiment.

  As shown in FIG. 11, the projectable range 410 and the projection frame 420 have an overlapping area, and the image 430 is displayed in the overlapping area. The image 430 is located at the approximate center of the projectable range 410. In addition, the image 430 is located substantially at the center of the projection frame 420.

  Here, when the image 430 is positioned substantially at the center of the projection frame 420, the image 430 can be moved in the left direction and in the right direction, and indicators indicating that the image can be moved in these directions are displayed.

  Here, consider a case in which the user instructs to move the video 430 in the right direction within the projection frame 420 using a user interface or the like.

  In the first embodiment, as shown in FIGS. 11 and 12, the projection unit control unit 260 controls the projection unit 110 so that the position of the projectable range 410 moves in the lower right direction A. On the other hand, the element control unit 270 controls the liquid crystal panel 50 to move the display position of the image 430 in the upper right direction B in the projectable range 410 in conjunction with the movement of the position of the projectable range 410.

  On the other hand, consider a case where the user instructs to move the image 430 leftward within the projection frame 420 using a user interface or the like. In such a case, it is not necessary to move the projectable range 410, and the element control unit 270 controls the liquid crystal panel 50 to move the display position of the image 430 leftward within the projectable range 410 without any movement of the position of the projectable range 410.

(Operation of projection display device)
Hereinafter, an operation of the projection display apparatus (control unit) according to the first embodiment will be described with reference to the drawings. FIG. 13 is a flowchart showing the operation of the projection display apparatus 100 (control unit 200) according to the first embodiment.

  As shown in FIG. 13, in step 10, the projection display apparatus 100 detects the projection frame 420. For example, the projection display apparatus 100 detects the projection frame 420 based on the imaging data of the imaging apparatus 300.

  In step 20, the projection display apparatus 100 adjusts the position of the projectable range 410 so that the projectable range 410 and the projection frame 420 overlap with each other under the control of the projection unit 110. Of course, the image 430 is included in the overlapping area of the projectable range 410 and the projection frame 420.

  In step 30, the projection display apparatus 100 displays a test pattern image. That is, the projection display apparatus 100 projects a test pattern image on the projection plane 400 by controlling the liquid crystal panel 50 or the like.

  In step 40, the imaging device 300 provided in the projection display apparatus 100 images the projection plane 400. That is, the imaging apparatus 300 captures a test pattern image projected on the projection plane 400.

  In step 50, the projection display apparatus 100 displays a preparation image. That is, the projection display apparatus 100 projects a preparation image on the projection plane 400 by controlling the liquid crystal panel 50 or the like.

  The preparation image may be, for example, a blue background image or a black background image.

  In step 60, the projection display apparatus 100 sequentially reads the captured images of the test pattern image from the imaging apparatus 300 along a predetermined reading direction in the test pattern image. Specifically, the projection display apparatus 100 reads out a captured image corresponding to a predetermined line from the captured image captured by the imaging apparatus 300 for each predetermined line constituting the test pattern image to the line buffer.

In step 70, the projection display apparatus 100 acquires three or more intersection points in the captured image (for example, Pt1 to Pt4 shown in FIG. 9) based on the captured images sequentially read along the predetermined reading direction.

In step 80, the projection display apparatus 100 calculates the positional relationship between the projection display apparatus 100 and the projection plane 400 based on the four intersection points (Ps1 to Ps4) in the test pattern image and the four intersection points (Pt1 to Pt4) in the captured image.

  In step 90, the projection display apparatus 100 displays an indicator indicating the direction in which the image projected on the projection plane 400 can move within the projection frame 420. That is, the projection display apparatus 100 projects the indicator by controlling the liquid crystal panel 50.

  In step 100, the projection display apparatus 100 receives a user operation for instructing movement of the image 430 using the user interface.

  In step 110, the projection display apparatus 100 moves the position of the projectable range 410 by controlling the projection unit 110 so that the image 430 remains included in the overlapping area of the projectable range 410 and the projection frame 420 even when the image 430 is moved in the direction according to the user operation.

  In step 120, the projection display apparatus 100 controls the liquid crystal panel 50 so that the image projected on the projection plane 400 fits in the projection frame 420 in conjunction with the movement of the position of the projectable range 410.

  Note that it is preferable that the processing of step 110 and step 120 is performed at the same time so that the image 430 moves smoothly in the direction according to the user operation while maintaining the state where the image 430 is within the projection frame 420. That is, it is preferable that the processing of step 110 and step 120 is performed simultaneously so as not to give the user a sense of incongruity.

  In step 130, the projection display apparatus 100 determines whether the image 430 has reached the projection frame 420. When the image 430 reaches the projection frame 420, the projection display apparatus 100 proceeds to the process of step 140. When the image 430 has not reached the projection frame 420, the projection display apparatus 100 returns to the process of step 90.

  In step 140, the projection display apparatus 100 ends the display of the indicator. For example, when the image 430 reaches the left end of the projection frame 420, the display of the indicator indicating that it can move in the left direction is ended. Alternatively, when the image 430 reaches the right end of the projection frame 420, the display of the indicator indicating that it can move in the right direction is ended.

(Function and effect)
In the first embodiment, the element control unit 270 controls the liquid crystal panel 50 so that the image 430 projected on the projection plane 400 fits in the projection frame 420 in conjunction with the movement of the position of the projectable range 410. Therefore, the display position of the image 430 projected on the projection plane 400 can be flexibly changed.

  It should be noted that if it is not necessary to move the position of the projectable range 410, it is only necessary to move the position of the image 430 within the projectable range 410.

  In the first embodiment, the element control unit 270 controls the liquid crystal panel 50 to display an indicator indicating a direction in which the image 430 can be moved in the projection frame 420. Therefore, the user can easily move the display position of the image 430 projected on the projection plane 400.

  In addition, since the image 430 within the projectable range 410 moves in conjunction with the movement of the position of the projectable range 410, the image 430 can be moved within the projection frame 420 even if the projectable range 410 does not overlap the entire projection frame 420. Further, the image 430 can be moved within the projection frame 420 even if the direction in which the projectable range 410 moves under the control of the projection unit 110 differs from the horizontal or vertical direction.

[Modification 1]
Hereinafter, Modification Example 1 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the first modification, the element control unit 270 described above controls the liquid crystal panel 50 as described below in conjunction with the reduction or enlargement of the projectable range 410.

  First, the element control unit 270 moves the position of the image 430 within the projectable range 410, in conjunction with the reduction of the projectable range 410, without changing the center position of the image 430 projected onto the projection plane 400. Specifically, the element control unit 270 acquires the reduction amount and reduction speed of the projectable range 410 from the projection unit control unit 260, and controls the liquid crystal panel 50 according to the reduction amount and reduction speed.

  Second, the element control unit 270 moves the position of the image 430 within the projectable range 410, in conjunction with the enlargement of the projectable range 410, without changing the center position of the image 430 projected onto the projection plane 400. Specifically, the element control unit 270 acquires the enlargement amount and enlargement speed of the projectable range 410 from the projection unit control unit 260, and controls the liquid crystal panel 50 according to the enlargement amount and enlargement speed.

  For example, when the center position of the projectable range 410 differs from the center position of the image 430, enlarging or reducing the projectable range 410 would shift the position of the image within the projection frame 420. In such a case, the element control unit 270 controls the liquid crystal panel 50 to move the position of the image within the projectable range 410 so that the center position of the image projected on the projection plane 400 is maintained unchanged.
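
  A minimal sketch of this center-preserving behavior, assuming coordinates normalized to the projectable range; the function and parameter names are hypothetical.

```python
# Illustrative sketch: when the projectable range is scaled by `scale` about its own
# center, recompute the image center inside the range so that the projected center
# of the image stays fixed on the projection plane.
def keep_center_after_scaling(image_center, range_center, scale):
    """Image center in normalized projectable-range coordinates after scaling."""
    cx, cy = image_center           # current image center inside the range (0..1)
    rx, ry = range_center           # center of the projectable range, e.g. (0.5, 0.5)
    # On the screen, a point p moves to r + scale * (p - r); keeping its screen
    # position fixed therefore requires moving the panel-side center to r + (p - r) / scale.
    return (rx + (cx - rx) / scale, ry + (cy - ry) / scale)
```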

(Example of video reduction)
In the following, a video reduction example according to Modification 1 will be described with reference to the drawings. FIGS. 14 and 15 are diagrams illustrating an example of video reduction according to the first modification.

  As shown in FIG. 14, the center X of the image 430 is shifted in the lower left direction with respect to the center Y of the projectable range 410. In such a case, when the projectable range 410 is reduced, the position of the image 430 is shifted in the upper right direction within the projection frame 420.

  In the first modification, as shown in FIGS. 14 and 15, the element control unit 270 controls the liquid crystal panel 50 so as to move the image 430 in the lower left direction C within the projectable range 410 in conjunction with the reduction of the projectable range 410, whereby the center position of the image 430 is maintained unchanged.

(Example of enlarged image)
Hereinafter, an example of video enlargement according to the first modification will be described with reference to the drawings. FIGS. 16 and 17 are diagrams illustrating an example of an enlarged image according to the first modification.

  As shown in FIG. 16, the center X of the image 430 is shifted in the lower left direction with respect to the center Y of the projectable range 410. In such a case, when the projectable range 410 is enlarged, the position of the image 430 is shifted in the lower left direction within the projection frame 420.

  In the first modification, as shown in FIGS. 16 and 17, the element control unit 270 controls the liquid crystal panel 50 so as to move the image 430 in the upper right direction D within the projectable range 410 in conjunction with the enlargement of the projectable range 410, whereby the center position of the image 430 is maintained unchanged.

  In the first modification, the case where the projectable range 410 is reduced or enlarged without changing the position of the projectable range 410 is illustrated. However, in the first modification, of course, the projectable range 410 may be reduced or enlarged while moving the position of the projectable range 410.

(Function and effect)
In the first modification, the element control unit 270 moves the position of the image 430 within the projectable range 410 without changing the center position of the image 430 in conjunction with the reduction or expansion of the projectable range 410. Thus, since the center position of the video 430 is maintained, the video 430 can be reduced or enlarged at a position desired by the user.

[Modification 2]
Hereinafter, Modification Example 2 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the second modification, the element control unit 270 described above controls the liquid crystal panel 50 so as to display candidate positions where the image 430 projected on the projection plane 400 can be displayed within the projection frame 420. The candidate position is represented by, for example, a dotted line surrounding a candidate area where the video 430 can be displayed.

(Example of candidate position display)
Hereinafter, a display example of candidate positions according to the second modification will be described with reference to the drawings. FIGS. 18 and 19 are diagrams illustrating display examples of candidate positions according to the second modification.

  As shown in FIG. 18, a plurality of candidate positions (here, candidate 1 and candidate 2) are displayed as candidate positions where the image 430 can be displayed in the projection frame 420.

  Here, consider a case where the user instructs to select candidate 2 using a user interface or the like.

  In the second modification, as shown in FIGS. 18 and 19, the image 430 is displayed on the candidate 2 in the projection frame 420.

(Function and effect)
In the second modification, the element control unit 270 controls the liquid crystal panel 50 so as to display a candidate position where the image 430 projected on the projection plane 400 can be displayed in the projection frame 420. Therefore, the user can easily move the image 430 within the projection frame 420.

  Since candidate positions are determined in advance, the calculation processing of the projection display apparatus 100 can be performed in advance. Therefore, the display position of the video 430 can be quickly moved.

[Modification 3]
Hereinafter, Modification Example 3 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the third modification, the above-described detection unit 240 detects the projection frame 420 by detecting the detection target provided on the projection plane 400.

  For example, as illustrated in FIG. 20, the detection unit 240 detects four detection targets 421 provided on the projection plane 400. As a result, the detection unit 240 detects the four corners of the projection frame 420.

  It is also conceivable that the region surrounded by the four detection targets 421 is not a rectangle (hereinafter referred to as a rectangular projection frame) constituted by a pair of sides extending in the horizontal direction and a pair of sides extending in the vertical direction. In such a case, the detection unit 240 detects the largest rectangular projection frame as the projection frame 420 within the region surrounded by the four detection targets 421.

  For example, as illustrated in FIGS. 21 and 22, when one detection target 421 is moved by the user, the detection unit 240 detects, as the projection frame 420, the largest rectangular projection frame within the region surrounded by the four detection targets 421.

  The detection target 421 may be a marker provided on the projection plane 400. In such a case, the projection frame 420 is detected by detecting the markers. For example, when four markers are provided, the largest rectangular projection frame within the region surrounded by the four markers is detected as the projection frame 420. When two markers are provided, two of the four sides of the projection frame 420 that meet at one corner are detected from one marker, and the other two sides that meet at the opposite corner are detected from the other marker.
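
  Purely as a simplified illustration (the specification states only that the largest rectangular projection frame within the region is detected, without reproducing the algorithm), one way to obtain an axis-aligned rectangle inside the region spanned by the four detection targets 421 is to intersect their horizontal and vertical extents. This assumes the four targets form a convex, roughly axis-aligned quadrilateral with its corners correctly identified, and the result is not necessarily the largest such rectangle.

```python
# Simplified illustrative sketch: an axis-aligned rectangle obtained from the four
# detection targets, given in image coordinates (y grows downward). Hypothetical names.
def inner_rectangle(top_left, top_right, bottom_right, bottom_left):
    """Return (left, top, right, bottom) for a candidate rectangular projection frame."""
    left = max(top_left[0], bottom_left[0])
    right = min(top_right[0], bottom_right[0])
    top = max(top_left[1], top_right[1])
    bottom = min(bottom_left[1], bottom_right[1])
    return (left, top, right, bottom)
```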

  Alternatively, the detection target 421 may be spot light irradiated on the projection plane 400 from a laser pointer or an infrared pointer. In such a case, as shown in FIGS. 21 and 22, the size of the projection frame 420 is changed by the movement of the spot light.

  Alternatively, the detection target 421 may be a user's hand. In such a case, as shown in FIGS. 21 and 22, the size of the projection frame 420 is changed by the movement of the user's hand or the like.

  Alternatively, the detection target 421 may be a paper surface provided on the projection surface 400. In such a case, the outer frame of the paper surface is detected as the projection frame 420.

  Alternatively, the detection target 421 may be a frame line drawn on the projection plane 400. In such a case, the frame line is detected as the projection frame 420.

(Function and effect)
In the third modification, the detection unit 240 detects the projection frame 420 by detecting the detection target 421 provided on the projection plane 400. Therefore, the detection of the projection frame 420 is easy. Further, the size of the projection frame 420 can be easily changed by moving the detection target 421 or the like.

[Modification 4]
Hereinafter, Modification Example 4 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the fourth modification, the element control unit 270 described above controls the liquid crystal panel 50 so as to adjust the aspect ratio of the image 430 according to the projection frame 420.

(Aspect ratio adjustment example)
Hereinafter, an example of adjusting the aspect ratio according to Modification 4 will be described with reference to the drawings. FIGS. 23 and 24 are diagrams illustrating an example of adjusting the aspect ratio according to the fourth modification.

  As shown in FIG. 23, when the image 430 is set to be displayed over the entire projection frame 420, the image 430 is stretched in the horizontal direction or the vertical direction rather than keeping its original aspect ratio. Alternatively, the image 430 is compressed in the horizontal direction or the vertical direction.

  Alternatively, as shown in FIG. 24, when the image 430 is set to be displayed with its original aspect ratio, the image 430 is enlarged or reduced so as to fit within the projection frame 420.
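
  A minimal Python sketch of the two display modes of FIGS. 23 and 24 follows; the function name and the sample resolutions are illustrative assumptions, not taken from the patent.

def adjust_to_frame(image_w, image_h, frame_w, frame_h, mode="fit"):
    if mode == "stretch":   # FIG. 23: fill the entire projection frame, aspect may change
        return frame_w, frame_h
    if mode == "fit":       # FIG. 24: keep the original aspect ratio, enlarge or reduce
        scale = min(frame_w / image_w, frame_h / image_h)
        return image_w * scale, image_h * scale
    raise ValueError("unknown mode")

# Example: a 16:9 image in a 4:3 projection frame.
print(adjust_to_frame(1920, 1080, 1024, 768, mode="stretch"))  # (1024, 768): fills the frame
print(adjust_to_frame(1920, 1080, 1024, 768, mode="fit"))      # ~ (1024, 576): aspect preserved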

  Note that the display method (aspect ratio) of the image 430 in the projection frame 420 is preferably set by the user using a user interface or the like.

(Function and effect)
In the modification example 4, the element control unit 270 controls the liquid crystal panel 50 to adjust the aspect ratio of the image 430 according to the projection frame 420. Therefore, a desired aspect ratio can be easily obtained.

[Modification 5]
Hereinafter, Modification 5 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the fifth modification, the element control unit 270 described above causes the user to select whether or not to adjust the aspect ratio of the video 430 according to the projection frame 420.

  Hereinafter, an example of adjusting the aspect ratio according to the fifth modification will be described with reference to the drawings. FIG. 25 is a diagram illustrating a display example asking whether or not aspect ratio adjustment is necessary according to the fifth modification.

  As shown in FIG. 25, when the image 430 is set to be displayed in accordance with the aspect ratio of the projection frame 420, the aspect ratio may have to be changed significantly from the original aspect ratio of the image 430. In this case, an image for selecting whether or not aspect ratio correction is necessary is superimposed on the projected image. The user instructs the projection display apparatus 100 whether or not the aspect ratio adjustment is necessary in accordance with this superimposed prompt.

  When the aspect ratio of the projection frame 420 and the original aspect ratio of the image 430 are substantially the same, it is preferable to perform the aspect ratio adjustment without superimposing the selection image on the projected image.
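
  The decision of whether to superimpose the selection image can be sketched in Python as follows; the tolerance value is an assumed threshold, since the patent only states that the prompt is skipped when the two aspect ratios are substantially the same.

def needs_aspect_prompt(image_w, image_h, frame_w, frame_h, tolerance=0.05):
    image_ratio = image_w / image_h
    frame_ratio = frame_w / frame_h
    # Prompt only when the relative difference between the ratios is noticeable.
    return abs(frame_ratio - image_ratio) / image_ratio > tolerance

print(needs_aspect_prompt(1920, 1080, 1024, 768))   # True  -> superimpose the prompt
print(needs_aspect_prompt(1920, 1080, 1280, 720))   # False -> adjust without prompting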

(Function and effect)
In the fifth modification, an image for selecting whether or not the aspect ratio correction is necessary is superimposed on the projected image, and the user is allowed to select whether or not the aspect ratio adjustment is necessary. Therefore, the user can decide whether or not the aspect ratio adjustment is necessary according to the image 430 to be displayed, and the image is not viewed in a state significantly different from its original aspect ratio.

[Modification 6]
Hereinafter, Modification 6 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the modification example 6, the element control unit 270 described above displays a plurality of images 430 side by side in accordance with the projection frame 420.

  Hereinafter, an example in which a plurality of images 430 according to Modification 6 are arranged will be described with reference to the drawings. FIGS. 26 and 27 are diagrams illustrating an example in which a plurality of images 430 according to Modification 6 are displayed side by side.

  As shown in FIGS. 26 and 27, when the image 430 is displayed in the projection frame 420 with its original aspect ratio, the non-display area may become large. Specifically, in the example of FIG. 26, the horizontal length of the projection frame 420 is twice or more the horizontal length of the image 430. In this case, two images 430 are displayed side by side in the horizontal direction. In the example of FIG. 27, the vertical length of the projection frame 420 is twice or more the vertical length of the image 430. In this case, two images 430 are displayed side by side in the vertical direction.

  In the modified example 6, the images 430 having the same size are arranged in the horizontal direction or the vertical direction. However, the present invention is not limited to this; when two images are displayed, one image may be displayed at a smaller size and the two images may be arranged in the horizontal direction or the vertical direction.

  Further, in Modification 6, the same image 430 is arranged in the horizontal direction or the vertical direction. However, the present invention is not limited to this; when two different video signals are input to the projection display apparatus 100, different images may be arranged in the horizontal direction or the vertical direction.

  The number of images 430 arranged is not limited to two. When the length of the projection frame 420 in the horizontal or vertical direction is N times or more the corresponding length of the image 430 (N being a positive integer), N images may be displayed in the horizontal or vertical direction.
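
  A minimal Python sketch of this side-by-side arrangement follows; the layout rule (tiling along whichever axis has more room) and all names are illustrative assumptions.

def tile_images(image_w, image_h, frame_w, frame_h):
    # Fit one image inside the frame at its original aspect ratio.
    scale = min(frame_w / image_w, frame_h / image_h)
    w, h = image_w * scale, image_h * scale
    # Count how many fitted copies fit along each axis.
    n_horizontal = int(frame_w // w)
    n_vertical = int(frame_h // h)
    if n_horizontal >= n_vertical:
        return [(i * w, 0.0, w, h) for i in range(n_horizontal)]
    return [(0.0, i * h, w, h) for i in range(n_vertical)]

# FIG. 26-like case: the frame is more than twice as wide as the fitted image.
for rect in tile_images(400, 300, 1000, 300):
    print(rect)   # two (x, y, width, height) placements side by side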

(Function and effect)
In the modification example 6, when two or more images can fit in the projection frame 420 while the image 430 keeps its original aspect ratio, two or more images 430 are displayed. Therefore, the non-display area can be used effectively even when displaying a single image 430 at its original aspect ratio would leave a large non-display area in the projection frame 420.

[Modification 7]
Hereinafter, Modification Example 7 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the modification example 7, the element control unit 270 described above re-adjusts the size of the image 430 in response to the aspect ratio of the image 430 being changed during projection.

  Hereinafter, an example of readjustment of the aspect ratio according to Modification 7 will be described with reference to the drawings. FIGS. 28 and 29 are diagrams illustrating an example in which the element control unit 270 according to Modification 7 re-adjusts the size of the image 430 in response to a change in the aspect ratio of the image 430 during projection.

  Assume that, while the image 430 is being projected, the original aspect ratio of the image 430 is changed due to a change in image content or the like. In such a case, if the aspect ratio is adjusted while maintaining the size of the image before the aspect ratio change, the top and bottom of the image 430 are displayed in black as shown in FIG. 28.

In the modification example 7, instead of displaying the top and bottom of the image 430 in black when the original aspect ratio of the image 430 is changed, the calculation unit 250 recalculates the display area optimal for the projection frame 420. The element control unit 270 re-adjusts the image 430 according to the recalculation result (see FIG. 29).
In Modification 7, it goes without saying that only the recalculation of the optimal display area by the calculation unit 250 is required; the projection frame 420 does not need to be redetected.
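
  The readjustment can be sketched in Python as follows, reusing the fit computation shown for Modification 4; the cached-frame class and its names are illustrative assumptions and are not part of the patent.

class DisplayAreaCache:
    def __init__(self, frame_w, frame_h):
        # The projection frame is detected once and kept across aspect-ratio changes.
        self.frame_w, self.frame_h = frame_w, frame_h

    def refit(self, image_w, image_h):
        # Only the optimal display area is recomputed when the source aspect changes.
        scale = min(self.frame_w / image_w, self.frame_h / image_h)
        return image_w * scale, image_h * scale

cache = DisplayAreaCache(1024, 768)
print(cache.refit(1920, 1080))   # 16:9 content -> ~ (1024, 576)
print(cache.refit(1440, 1080))   # content switches to 4:3 -> ~ (1024, 768)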

(Function and effect)
In the modification example 7, when the original aspect ratio of the image 430 is changed, the display area optimal for the projection frame 420 is recalculated by the calculation unit 250. Therefore, even if the original aspect ratio of the image 430 is changed, the image can be displayed with an optimal display size. Further, since it is not necessary to redetect the projection frame 420 when the aspect ratio is changed, the execution time can be shortened.

[Modification 8]
Hereinafter, Modification Example 8 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, Modification 8 provides a plurality of operation modes with different movement amounts for moving the image 430 leftward or rightward.

  Hereinafter, examples of operation modes with different movement amounts according to Modification 8 will be described with reference to the drawings. FIGS. 30 and 31 are diagrams illustrating examples of operation modes with different movement amounts according to the modification example 8.

  A first operation mode example will be described. As shown in FIG. 30, when the image 430 is moved leftward or rightward, it moves by a specific ratio of its horizontal length. For example, 1/10 of the horizontal length of the image 430 is set as the specific ratio, and the image 430 is moved leftward or rightward in steps of this length.

  A second operation mode example will be described. As shown in FIG. 31, when the image 430 is moved leftward or rightward, it moves to the end of the projectable range at once.

  The two operation modes are switched depending on how long the user presses a key (for example, a direction key). Specifically, when the direction key is pressed for a short time, the image 430 moves leftward or rightward by the specific ratio of its horizontal length; when the direction key is pressed for a long time, the image 430 moves to the end of the projectable range at once.
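
  The two operation modes and the press-length switching can be sketched in Python as follows; the 0.8-second threshold is an assumed value, since the patent only distinguishes short and long presses of the direction key.

LONG_PRESS_SEC = 0.8   # assumed threshold for a "long" press
STEP_RATIO = 1 / 10    # one step = 1/10 of the image's horizontal length

def horizontal_move(x, image_w, range_left, range_right, direction, press_sec):
    # Return the new left edge of the image after one direction-key press.
    # direction is +1 (rightward) or -1 (leftward).
    if press_sec >= LONG_PRESS_SEC:
        # Second mode (FIG. 31): jump to the end of the projectable range.
        return range_right - image_w if direction > 0 else range_left
    # First mode (FIG. 30): move by one step of 1/10 of the image width.
    new_x = x + direction * STEP_RATIO * image_w
    return max(range_left, min(new_x, range_right - image_w))

# A short press nudges the image; a long press sends it to the range edge.
print(horizontal_move(100, 400, 0, 1000, +1, press_sec=0.2))  # 140.0
print(horizontal_move(100, 400, 0, 1000, +1, press_sec=1.5))  # 600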

(Function and effect)
Since Modification 8 provides two operation modes, one in which the image 430 moves leftward or rightward by a specific ratio of its horizontal length and one in which it moves to the end of the projectable range at once, the user can quickly move the image 430 to a desired position.

  In the modified example 8, the two modes are switched depending on how long the direction key is pressed. However, the present invention is not limited to this; the two operation modes may be switched by pressing a key other than the direction key indicating the moving direction. Further, the image 430 may be moved to the end of the projectable range at once by pressing the direction key twice in succession.

[Modification 9]
Hereinafter, Modification 9 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in Modification 9, when the user performs a button operation for moving the video 430, the movable range 450 is superimposed on the projected video so that the user can know the movable range 450 of the video 430.

  Hereinafter, an example in which the movable range 450 of the video 430 according to the modification example 9 is displayed will be described with reference to the drawings. FIG. 32 is a diagram illustrating an example in which the movable range 450 of the video 430 according to the modification example 9 is displayed.

  As illustrated in FIG. 32, when a button operation by the user for moving the image 430 is detected, the calculation unit 250 determines the range in which the projectable range 410 and the projection frame 420 overlap as the movable range 450. The element control unit 270 superimposes the movable range 450 determined by the calculation unit 250 on the image 430 and projects it with a dotted line.
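
  The determination of the movable range 450 can be sketched in Python as the rectangle intersection below; the (left, top, right, bottom) convention and the sample coordinates are illustrative assumptions.

def movable_range(projectable, frame):
    # Intersection of the projectable range and the projection frame.
    left = max(projectable[0], frame[0])
    top = max(projectable[1], frame[1])
    right = min(projectable[2], frame[2])
    bottom = min(projectable[3], frame[3])
    if right <= left or bottom <= top:
        return None   # no overlap: the image cannot be placed inside the frame
    return (left, top, right, bottom)

projectable_range_410 = (0, 0, 1200, 900)
projection_frame_420 = (100, 50, 1000, 950)
print(movable_range(projectable_range_410, projection_frame_420))
# (100, 50, 1000, 900) -> superimposed on the image 430 as a dotted line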

(Function and effect)
In the modification example 9, the range in which the projectable range 410 and the projection frame 420 overlap is determined as the movable range 450, and the movable range 450 is superimposed on the image 430 and projected with a dotted line. Therefore, at the same time as performing the button operation, the user can confirm whether or not the image 430 can be moved to the desired position.

[Modification 10]
Hereinafter, Modification 10 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the tenth modification, when the image 430 has reached its movable limit and the user wants to move the image 430 beyond that limit, the image 430 is trimmed so that the unprojected area of the projection frame 420 is increased.

  Hereinafter, an example in which the user moves the image 430 beyond its movable limit according to Modification 10 will be described with reference to the drawings. FIGS. 33 and 34 are diagrams illustrating a case where the user moves the image 430 beyond the movable limit according to the tenth modification.

  As shown in FIG. 33, the image 430 has reached the limit of its movable range (here, assumed to coincide with the edge of the projection frame 420). In this state, when the user further issues a command by button operation to move the image 430 beyond the movable range, the left portion of the image 430 that protrudes outside the projection frame 420 is trimmed, as shown in FIG. 34.
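
  The trimming behavior can be sketched in Python as follows; the one-dimensional coordinates and names are illustrative assumptions.

def move_and_trim(image_left, image_w, frame_left, frame_right, dx):
    # Shift the image by dx and clip it to the projection frame; the clipped
    # part is no longer projected, which enlarges the unprojected area.
    new_left = image_left + dx
    visible_left = max(new_left, frame_left)
    visible_right = min(new_left + image_w, frame_right)
    visible = max(0, visible_right - visible_left)
    trimmed = image_w - visible
    return visible_left, visible_left + visible, trimmed

# Image already flush with the left edge of the frame; moving it further left
# trims 100 units off the image and frees 100 units of the frame.
print(move_and_trim(image_left=0, image_w=400, frame_left=0,
                    frame_right=800, dx=-100))   # (0, 300, 100)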

(Function and effect)
Modification 10 is effective when, for example, the user wants to move the image 430 in order to increase the unprojected area of the projection frame 420. Even in the state where the image 430 has reached the limit of its movable range, the unprojected area of the projection frame 420 can be increased by the user's button operation. Therefore, when the user gives a presentation by projecting the image 430 onto a whiteboard or the like, this is useful for writing supplementary explanations in the unprojected area of the whiteboard.

  The configuration for writing a supplementary explanation on a whiteboard or the like, as described in the tenth modification, is not limited to the tenth modification. For example, as shown in FIG. 35, the image 430 may be reduced to increase the unprojected area of the projection frame 420.

  Further, the image 430 may be converted into a translucent image and projected onto the projection frame 420 so that the supplementary explanation stands out when it is written.

[Modification 11]
Hereinafter, Modification Example 11 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the modified example 11, when the movement of the image 430 is operated with the direction keys of the remote controller 500, the image 430 moves in opposite directions depending on whether the operation is performed from the projection side (front) of the projection display apparatus 100 or from the side opposite to the projection side (rear).

  Hereinafter, an example of the moving direction of the image 430 when the direction key of the remote controller 500 is operated from the front and rear of the projection display apparatus 100 according to the modification 11 will be described with reference to the drawings. FIG. 36 is a diagram illustrating the moving direction of the image 430 when the direction key of the remote controller 500 is operated from the front and rear of the projection display apparatus 100 according to the modification 11.

  As shown in FIG. 36, a front receiving unit 130 and a rear receiving unit 140 that receive infrared signals from the remote controller 500 are provided on the front and the rear of the projection display apparatus 100, respectively.

  The front receiving unit 130 can receive an infrared signal from the front of the projection display apparatus 100 and cannot receive an infrared signal from the rear.

  The rear receiving unit 140 can receive an infrared signal from the rear of the projection display apparatus 100 and cannot receive an infrared signal from the front.

  The case where the image 430 is moved in the left direction when viewed from the projection display apparatus 100 will be described separately for the case where the remote controller 500 is located in front of the projection display apparatus 100 and the case where it is behind.

  A state in which the remote controller 500 is in front of the projection display apparatus 100 will be described. When the remote controller 500 is in front of the projection display apparatus 100 (A) and the right key of the remote controller 500 is pressed, the infrared signal from the remote controller 500 is received by the front receiving unit 130. Upon receiving the right-key infrared signal from the front receiving unit 130, the element control unit 270 moves the image 430 leftward.

  A state in which the remote controller 500 is behind the projection display apparatus 100 will be described. When the left key of the remote controller 500 is pressed while the remote controller 500 is behind the projection display apparatus 100 (B), the infrared signal from the remote controller 500 is received by the rear receiving unit 140. Upon receiving the left-key infrared signal from the rear receiving unit 140, the element control unit 270 moves the image 430 leftward.
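
  The relationship between the pressed key, the receiving unit, and the moving direction can be sketched in Python as follows; the mapping reproduces only the two cases (A) and (B) described above, and the names are illustrative assumptions.

def screen_direction(key, receiver):
    # key is 'left' or 'right' on the remote controller 500;
    # receiver is 'front' (receiving unit 130) or 'rear' (receiving unit 140).
    # Returns the direction in which the image 430 moves, as seen from the
    # projection display apparatus 100.
    mirrored = {"left": "right", "right": "left"}
    return mirrored[key] if receiver == "front" else key

print(screen_direction("right", "front"))  # 'left'  (case A in FIG. 36)
print(screen_direction("left", "rear"))    # 'left'  (case B in FIG. 36)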

(Function and effect)
In the modified example 11, when the movement of the image 430 is operated with the direction keys of the remote controller 500, the image 430 moves in opposite directions depending on whether the operation is performed from the front or from the rear of the projection display apparatus 100. Accordingly, whether the user moves the image 430 from in front of or from behind the projection display apparatus 100, the direction in which the direction key of the remote controller 500 is operated coincides with the direction in which the image 430 moves, so that intuitive remote control operation is possible.

  As described in the modification example 11, when the movement of the image 430 is operated with the direction keys of the remote controller 500, the image 430 moves in opposite directions depending on whether the operation is performed from the front or from the rear of the projection display apparatus 100. However, the present invention is not limited to this. For example, as shown in FIG. 37, the direction keys of the remote controller 500 may be given symbols A to D, and each symbol may be associated with a moving direction of the image 430. Color coding may be used instead of symbols.

[Modification 12]
Hereinafter, Modification 12 of the first embodiment will be described with reference to the drawings. In the following, differences from the first embodiment will be mainly described.

  Specifically, in the twelfth modification, an interactive function is provided in which the projection plane 400 is always imaged by the imaging device 300, the trajectory of the dedicated interactive pen 600 is captured, and the trajectory is superimposed on the video 430. The interactive pen 600 functions as a remote controller that moves the video 430.

  Hereinafter, an example of the interactive pen 600 having a remote control function for moving the video 430 according to the modification 12 will be described with reference to the drawings. FIG. 38 is a diagram illustrating an interactive pen 600 having a remote control function for moving the video 430 according to the twelfth modification.

  The projection display apparatus 100 according to Modification 12 constantly captures an image of the projection surface 400 with the imaging device 300, and when the dedicated interactive pen 600 shown in FIG. 38 moves on the projection surface 400, the locus drawn by the interactive pen 600 is superimposed on the image 430.

  The interactive pen 600 has a mode switching button 610 and a direction button 620.

  The mode switching button 610 is a button for alternately switching between an interactive mode in which the interactive function is activated and a display position movement mode in which the video 430 is moved.

  The direction button 620 is used to move the image 430 when the display position movement mode is selected, and consists of a pair of buttons. When one of the pair of buttons of the direction button 620 is pressed, the image 430 moves.
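
  The button behavior of the interactive pen 600 can be sketched in Python as follows; the class structure and method names are illustrative assumptions.

class InteractivePen:
    MODES = ("interactive", "display_position_movement")

    def __init__(self):
        self.mode = "interactive"

    def press_mode_button(self):
        # Mode switching button 610: toggle between the two modes.
        i = self.MODES.index(self.mode)
        self.mode = self.MODES[(i + 1) % 2]

    def press_direction_button(self, direction):
        # Direction button 620: only moves the image 430 in the movement mode.
        if self.mode != "display_position_movement":
            return None        # interactive mode: the pen draws a trajectory instead
        return direction       # forwarded as a move command for the image 430

pen = InteractivePen()
print(pen.press_direction_button("left"))   # None: still in interactive mode
pen.press_mode_button()
print(pen.press_direction_button("left"))   # 'left': the image 430 moves left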

(Function and effect)
In the modification example 12, the interactive pen 600 includes the mode switching button 610, so that the interactive mode and the display position movement mode can be switched. Accordingly, it is not necessary to have a remote control or the like in addition to the interactive pen when moving the video 430, so that the usability of the user is improved.

  In the modification example 12, the image 430 is moved using the direction button 620 of the interactive pen 600. However, the present invention is not limited to this; as shown in the drawing, the interactive pen 600 may be provided with a rotation detection sensor 630 that detects rotation of the pen, and the movement of the image 430 may be controlled based on the detected rotation.

  Likewise, the trajectory information of the interactive pen 600 may be detected, and the image 430 may be moved based on the detection result.

  In the modified example 12, a pen-type device is used as the interactive pen 600. However, the present invention is not limited to this; a laser-pointer-type device may be used, with the imaging device 300 detecting the laser light emitted from the laser pointer.

  Further, when moving the image 430 using the interactive pen 600, in order to decide whether the image 430 should be moved in the left-right direction or in the up-down direction, the movable range in the left-right direction may be compared with the movable range in the up-down direction, the single axis along which the image 430 can move farther may be selected, and the image 430 may be moved only along that axis.
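
  This single-axis selection can be sketched in Python as follows; the rectangle convention (left, top, right, bottom) and the sample values are illustrative assumptions.

def select_move_axis(image_rect, movable_rect):
    il, it, ir, ib = image_rect
    ml, mt, mr, mb = movable_rect
    horizontal_room = (il - ml) + (mr - ir)   # total slack to the left and right
    vertical_room = (it - mt) + (mb - ib)     # total slack above and below
    return "horizontal" if horizontal_room >= vertical_room else "vertical"

image_430 = (200, 100, 600, 400)
movable_range_450 = (0, 50, 1000, 450)
print(select_move_axis(image_430, movable_range_450))   # 'horizontal'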

  In addition, when information other than the image 430 (for example, a clock) is displayed, the additional information may be moved together with the image 430 so as to maintain its relative positional relationship with the image 430 as the image 430 moves.

[Other Embodiments]
Although the present invention has been described with reference to the above-described embodiments, it should not be understood that the descriptions and drawings constituting a part of this disclosure limit the present invention. From this disclosure, various alternative embodiments, examples and operational techniques will be apparent to those skilled in the art.

  In the embodiment described above, a white light source is exemplified as the light source. However, the light source may be an LED (Light Emitting Diode) or an LD (Laser Diode).

  In the above-described embodiment, the transmissive liquid crystal panel is exemplified as the light modulation element. However, the light modulation element may be a reflective liquid crystal panel or a DMD (Digital Micromirror Device).

  In the above-described embodiment, the test pattern images shown in the drawings are exemplified. However, the test pattern image is not limited to these. Further, the case where the reading unit 230 has a line memory is exemplified. However, the reading unit 230 may have a frame memory.

  In the embodiment described above, the detection unit 240 detects the projection frame 420 based on the captured image of the imaging device 300. However, the detection unit 240 may be a sensor (a light amount sensor, an infrared sensor, or the like) that detects spot light irradiated on the projection plane 400 from a laser pointer or an infrared pointer.

  In the above-described embodiment, the element control unit 270 has been described as having a function of automatically performing keystone correction based on the positional relationship between the projection display apparatus 100 and the projection plane 400. However, the embodiment is not limited to this. For example, the element control unit 270 may adjust the focus position and the zoom magnification based on the positional relationship between the projection display apparatus 100 and the projection plane 400.

  In the first embodiment described above, the indicator indicating the direction in which the video 430 can be moved is an arrow, but the embodiment is not limited to this. The indicator may be a character or the like.

  Although not particularly mentioned in the first embodiment described above, when the image 430 is likely to reach the projection frame 420, the color of the indicator may be changed to a specific color (for example, red). By changing the color of the indicator, the user can be informed that the limit for moving the video 430 is approaching. Alternatively, the user may be notified by text or the like that the limit for moving the video 430 is approaching.

  In the first embodiment described above, the indicator indicating the direction in which the image projected on the projection plane 400 can move within the projection frame is exemplified. However, the embodiment is not limited to this. The indicator may instead indicate the direction in which the image projected on the projection plane 400 can be enlarged within the projection frame, or the direction in which the image projected on the projection plane 400 can be reduced within the projection frame.

  DESCRIPTION OF SYMBOLS 10 ... Light source, 20 ... UV/IR cut filter, 30 ... Fly-eye lens unit, 40 ... PBS array, 50 ... Liquid crystal panel, 52, 53 ... Polarizing plate, 60 ... Cross dichroic cube, 71-76 ... Mirror, 81-85 ... Lens, 100 ... Projection display apparatus, 110 ... Projection unit, 120 ... Lighting unit, 200 ... Control unit, 210 ... Video signal receiving unit, 220 ... Storage unit, 230 ... Reading unit, 240 ... Detection unit, 250 ... Calculation unit, 260 ... Generation unit, 270 ... Element control unit, 280 ... Specification unit, 300 ... Imaging device, 400 ... Projection plane, 410 ... Projectable range, 420 ... Projection frame, 430 ... Image

Claims (11)

  1. A projection display apparatus comprising: a light modulation element configured to modulate light emitted from a light source; and a projection unit configured to project light emitted from the light modulation element onto a projection plane, the projection display apparatus further comprising:
    a detection unit configured to detect a projection frame provided on the projection plane; and
    an element control unit configured to move a position of an image projected on the projection plane, by controlling the light modulation element, within a projectable range in which the projection unit can project an image,
    wherein the element control unit controls the light modulation element so that the image projected on the projection plane is accommodated in the projection frame.
  2. The projection display apparatus according to claim 1, wherein the element control unit controls the light modulation element so as to display an indicator indicating a direction in which the image projected on the projection plane can move within the projection frame, or an indicator indicating a direction in which the image projected on the projection plane can be enlarged or reduced within the projection frame.
  3. The projection display apparatus according to claim 1 or 2, further comprising a projection unit controller configured to move the position of the projectable range by controlling the projection unit,
    wherein the element control unit controls the light modulation element so that the image projected on the projection plane fits in the projection frame in conjunction with the movement of the position of the projectable range.
  4. The projection display apparatus according to claim 1, wherein the element control unit controls the light modulation element so as to move, in accordance with enlargement or reduction of the projectable range, the position of the image projected on the projection plane within the projectable range without changing a center position of the image projected on the projection plane.
  5. The projection display apparatus according to claim 1, wherein the element control unit controls the light modulation element so as to display a candidate position at which the image projected on the projection plane can be displayed within the projection frame.
  6.   The projection display apparatus according to claim 1, wherein the detection unit detects the projection frame by detecting a detection target provided on the projection surface.
  7. The projection display apparatus according to claim 1, wherein the element control unit has a first operation mode and a second operation mode when controlling the light modulation element,
    wherein, in the first operation mode, the image projected on the projection plane moves by a specific movement step, and
    in the second operation mode, the image projected on the projection plane moves so as to reach an end of a moving range.
  8. The projection display apparatus according to claim 1, further comprising a calculation unit that identifies an overlapping range of the projectable range and the projection frame,
    wherein the element control unit controls the light modulation element to display the overlapping range.
  9. The projection display apparatus according to claim 1 or 2, wherein the element control unit controls the light modulation element so as to enlarge an unprojected area of the projection frame when the image projected on the projection plane is made to move beyond a movement range.
  10. The projection display apparatus according to claim 1 or 2, wherein the element control unit controls the light modulation element so that the image projected on the projection plane becomes translucent when the image projected on the projection plane is made to move beyond a moving range.
  11. The projection display apparatus according to claim 1, further comprising:
    a remote controller for transmitting a command for causing the element control unit to move the position of the image projected on the projection plane; and
    a first receiving unit and a second receiving unit arranged at opposite positions and configured to receive a signal transmitted from the remote controller,
    wherein the element control unit controls the light modulation element so that the moving direction of the image projected on the projection plane is reversed between when the command from the remote controller is received by the first receiving unit and when the command is received by the second receiving unit.